Solved

Difference between Resolution and Accuracy

  • 1 April 2022

In several specifications for optical systems such as telescopes and projectors, resolution and accuracy are quoted separately. Please explain how the two differ and how to analyze each of them during design.

For example, resolution is given as 0.5 cm and accuracy as ±0.15 cm within a ±20 cm range.
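For intuition, here is a minimal numeric sketch (my own illustration, not from the thread, with hypothetical numbers matching the question): assuming the spec describes a distance-measuring instrument, *resolution* is the smallest increment the readout can report, while *accuracy* bounds how far any reading may sit from the true value.

```python
import numpy as np

# Hypothetical instrument matching the numbers in the question:
# resolution 0.5 cm (smallest step it can report),
# accuracy +/-0.15 cm (bound on systematic offset from truth),
# over a +/-20 cm measurement range.
STEP = 0.5   # cm, resolution: readings are quantized to this step
BIAS = 0.12  # cm, an assumed offset lying within the +/-0.15 cm accuracy spec

def reading(true_cm):
    # Accuracy error shifts the value; resolution then quantizes it.
    return STEP * np.round((true_cm + BIAS) / STEP)

# Two targets 0.1 cm apart (well below the resolution) give the same reading:
r1, r2 = reading(5.00), reading(5.10)  # both 5.0
```

So the two numbers answer different questions: resolution limits what *changes* the system can distinguish, accuracy limits how far any single reported value may deviate from the truth.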


Best answer by Jeff.Wilde 5 April 2022, 06:52


2 replies


We have a nice article about resolution that may help: https://support.zemax.com/hc/en-us/articles/1500005490501-Resolution-of-diffraction-limited-imaging-systems-using-the-point-spread-function


I like the Huygens PSF resolution analysis described in the KB article cited above. In fact, it can be readily extended to investigate the impact of the relative phase between the two point sources.

For an ideal aberration-free imaging system, two mutually coherent point sources separated by the Rayleigh distance will combine to form an intensity profile in the image plane that depends on their relative phase.  See, for example, Intro to Fourier Optics, 4th Ed., by Goodman:
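The ideal case can be sketched numerically (my own sketch, not from the post), using a 1-D sinc amplitude PSF in place of the circular-pupil Airy pattern for simplicity; the separation d = 1 is the Rayleigh distance for this PSF, i.e., its first zero:

```python
import numpy as np

# Amplitude PSF of an ideal 1-D aperture: np.sinc(x) = sin(pi*x)/(pi*x),
# whose first zero (Rayleigh distance) sits at x = 1.
d = 1.0
x = np.linspace(-5.0, 5.0, 2001)  # image-plane coordinate, x = 0 at center

def intensity(phi):
    """Coherent sum of two point-source images with relative phase phi."""
    field = np.sinc(x - d / 2) + np.exp(1j * phi) * np.sinc(x + d / 2)
    return np.abs(field) ** 2

I_0  = intensity(0.0)     # in phase: the two peaks merge into one broad lobe
I_pi = intensity(np.pi)   # out of phase: an exact null midway between the sources
```

In this aberration-free model every profile, including the φ = π/2 case, is left-right symmetric, since the cross term 2·A₁A₂·cos φ inherits the symmetry of the two real amplitudes.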

 

In OpticStudio, this can be replicated by using polarization analysis in a two-configuration setup, where the location and phase of each point source are defined separately.  For the 4X microscope described in the KB article, it’s easy to find the following results:

 

Note the asymmetry in the π/2 case, i.e., when the point sources are in quadrature. This asymmetry is a consequence of aberration: the sensitivity to phase fluctuations (aberrations), and hence the manner in which the two PSFs locally interfere, is greatest in quadrature.
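The quadrature claim can be checked with a one-line model (my sketch, with hypothetical amplitudes): at a fixed image point the coherent intensity is I(φ) = a₁² + a₂² + 2·a₁a₂·cos φ, so |dI/dφ| = 2·a₁a₂·|sin φ|, which peaks at φ = π/2.

```python
import numpy as np

# Coherent intensity at one image point from two real PSF amplitudes
a1, a2 = 0.8, 0.5                  # hypothetical local amplitudes
phis = np.linspace(0.0, np.pi, 1001)
I = a1**2 + a2**2 + 2 * a1 * a2 * np.cos(phis)

# Numerical sensitivity of intensity to the relative phase
sens = np.abs(np.gradient(I, phis))

phi_peak = phis[np.argmax(sens)]   # expected near pi/2 (quadrature)
```

This is why small phase perturbations, such as those introduced by aberration, distort the combined profile most strongly in the quadrature case.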
