A simple unit conversion from [1/microns] to [nm] gets you close. For instance, I compute the resolution of the 12 [1/microns] curve to be 83 nm; the others work out similarly.
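Spelled out, the conversion is just the reciprocal of the cutoff spatial frequency (writing $f_c$ for that cutoff):

$$ R = \frac{1}{f_c} = \frac{1}{12\ \mu\mathrm{m}^{-1}} \approx 0.083\ \mu\mathrm{m} \approx 83\ \mathrm{nm}. $$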
The factor of 2 is troubling and I can't explain it. I initially thought it had to do with sampling and the Nyquist criterion, but I don't see how that could apply here.
The Wikipedia page on the Optical transfer function (OTF) is a pretty good reference and computes resolution from the OTF cutoff frequency using a simple unit conversion, like I do above.
The factor of two may be due to there-and-back; it appears in all radar work. A rule of thumb is that the change in wavenumber times the resolution (strictly their dot product, since both are vectors) equals 2 pi. The change in wavenumber in this case is 2 pi times the bandwidth, with an extra factor of 2 for the there-and-back, so you get resolution = 1/(2*bandwidth).
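In symbols, a quick sketch of that rule of thumb (writing $B$ for the bandwidth and $\delta x$ for the resolution, neither of which is named in the plot):

$$ \Delta k\,\delta x = 2\pi, \qquad \Delta k = 2 \times 2\pi B \ \Rightarrow\ \delta x = \frac{2\pi}{\Delta k} = \frac{1}{2B}. $$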
In practice I expect it to be worse than that, because I would usually take the 3 dB bandwidth, which I think is narrower.
Adding to the colleagues' answers above, you can get an expression for the resolution as a function of the wavelength by following this link: https://www.quora.com/How-do-shorter-wavelengths-of-light-provide-higher-spatial-resolution-images-at-smaller-scales
According to the formula given in the link, if the numerical aperture is NA = 1, the resolution is R = 0.61 lambda.
With lambda = 1000/12 nm ≈ 83 nm, as given by the colleague Milo, this gives R ≈ 51 nm, which is very close to your measured value.
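For reference, here is that calculation written out, assuming the link's formula is the usual Rayleigh criterion $R = 0.61\,\lambda/\mathrm{NA}$:

$$ R = \frac{0.61\,\lambda}{\mathrm{NA}} = \frac{0.61 \times 83\ \mathrm{nm}}{1} \approx 51\ \mathrm{nm}. $$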