
Hello,

I read with interest the KB article by Mark Nicholson about Image Simulation within Zemax/OS. According to the article, “The PSF grid also includes the effects of polarization and relative illumination.” I assume this type of information is obtained by setting the polarization, after which the software computes the Huygens PSFs.

I wonder whether you could be a little more specific about how the image simulation takes the polarization aberrations into account with this PSF grid. Does Zemax follow a method similar to the one outlined in the Breckinridge, Lam, and Chipman paper, building a 4x4 Mueller matrix (the so-called point spread matrix)?
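For reference, by "point spread matrix" I mean the per-point Jones-to-Mueller construction from that paper: each image-plane sample of the 2x2 amplitude response matrix maps to a 4x4 Mueller sample. A minimal numpy sketch of that conversion at a single image point (the matrix values below are purely illustrative, not from any real system):

```python
import numpy as np

# Standard Jones-to-Mueller transformation matrix (Chipman's convention)
A = np.array([[1, 0,  0,  1],
              [1, 0,  0, -1],
              [0, 1,  1,  0],
              [0, 1j, -1j, 0]], dtype=complex)
A_INV = A.conj().T / 2  # A * A^dagger = 2*I, so the inverse is A^dagger / 2

def jones_to_mueller(J):
    """Convert a 2x2 complex Jones matrix to the equivalent 4x4 Mueller matrix."""
    M = A @ np.kron(J, J.conj()) @ A_INV
    return np.real_if_close(M)

# One sample of the amplitude response matrix (ARM) -> one 4x4 sample of the
# point spread matrix (PSM), in the sense of Breckinridge, Lam and Chipman.
arm = np.array([[0.9, 0.05j],
                [0.02, 0.85]], dtype=complex)  # illustrative numbers only
psm_sample = jones_to_mueller(arm)
```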

Thanks for your feedback,

Ben

I think it just calculates the array of Huygens PSFs with the ‘Use Polarization’ switch on, so you can run the Huygens PSF feature itself with polarization on and off to see what difference it makes.
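If you want to script that yourself, a rough ZOS-API sketch in Python would look something like the following. It assumes `TheSystem` and `ZOSAPI` already exist from the standard connection boilerplate, and the commented-out settings lines are placeholders, since the exact Huygens PSF settings properties depend on your OpticStudio version:

```python
import numpy as np

# Assumes `TheSystem` (IOpticalSystem) and `ZOSAPI` come from the standard
# ZOS-API Python boilerplate (interactive extension or standalone application).

def huygens_psf(fx, fy):
    """Run a Huygens PSF at field coordinates (fx, fy) and return it as a numpy array."""
    # Re-point field 1 at the requested position (one way to sample arbitrary field points)
    field = TheSystem.SystemData.Fields.GetField(1)
    field.X, field.Y = fx, fy

    analysis = TheSystem.Analyses.New_Analysis(ZOSAPI.Analysis.AnalysisIDM.HuygensPsf)
    settings = analysis.GetSettings()
    # NOTE: placeholder; on some versions the Huygens PSF settings (sampling,
    # wavelength, 'Use Polarization') are only reachable via settings.ModifySettings()
    # with the code names listed under MODIFYSETTINGS in the help file.
    # settings.UsePolarization = True

    analysis.ApplyAndWaitForCompletion()
    results = analysis.GetResults()
    grid = results.GetDataGrid(0)  # the analysis returns a single intensity data grid
    psf = np.array([[grid.Values[i, j] for j in range(grid.Ny)] for i in range(grid.Nx)])
    analysis.Close()
    return psf
```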


OK, I understand: Zemax uses the polarization settings (Jx/Jy and X-Phase/Y-Phase) and performs the proper computation, considering the pupil phase and amplitude aberrations for that specific input polarization state. This supposes that the polarization state of the input scene is known and set before running the computation (and that it is essentially constant over the imaged scene). In real life, the polarization state can change from image point to image point, so it would be useful to retrieve the PSF as a function of the Jones parameters (or some kind of point spread matrix). Do you have an idea of how one could access that kind of information using Zemax? The goal is to be able to predict the Huygens PSF for an arbitrary user-defined input polarization state (Jx, Jy, Px, Py) at field location (Fx, Fy) and wavelength W.
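To be explicit about the superposition I have in mind: since the system is linear in the field, two coherent basis responses would in principle be enough. This is only a sketch, assuming the complex image-plane amplitude responses to purely X- and purely Y-polarized inputs could somehow be exported (the `E_x`/`E_y` arrays below are hypothetical, not something the Huygens PSF analysis provides directly):

```python
import numpy as np

def psf_for_jones(E_x, E_y, jx, jy, px_deg=0.0, py_deg=0.0):
    """
    PSF for an arbitrary input Jones state, by superposing two basis responses.

    E_x, E_y : hypothetical complex arrays of shape (2, N, N) holding the image-plane
               Jones field (Ex, Ey output components) produced by a unit-amplitude
               X-polarized and Y-polarized input, at one field point and wavelength.
    jx, jy, px_deg, py_deg : input state, mirroring the Jx/Jy/X-Phase/Y-Phase convention.
    """
    cx = jx * np.exp(1j * np.deg2rad(px_deg))
    cy = jy * np.exp(1j * np.deg2rad(py_deg))
    # Superpose the basis fields, then take intensity summed over output components.
    E = cx * E_x + cy * E_y
    return np.sum(np.abs(E) ** 2, axis=0)
```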

Thanks in advance for any lead,

Regards


You might want to look at the Partially Coherent Image Analysis, as the coherence of the image is usually what modifies optical performance. Other than that, tell us more about the scene you’re viewing. It may be best to just brute-force it in non-sequential mode, using multiple sources to model the polarization states.


Thanks again for your feedback.

I’m doing end-to-end image simulation of an optical instrument. The idea is to somewhat expand the image simulation possibilities offered by the dedicated analysis tool within Zemax/OS. The input scene I’m observing contains targets that exhibit specific polarization, which varies across the image. The idea is to build a PSF library/grid accounting for polarization aberrations (amplitude/phase) to check how scene polarization may affect the performance of the instrument. As a first approach, I could consider that the input scene presents linear polarization only, with varying amplitude and direction.
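For that linear-only case, the mapping from a scene polarization angle to the Jx/Jy/phase values fed into the polarization settings is simple. A small helper (the function name is my own, and the amplitude scaling would be applied to the scene radiance separately):

```python
import numpy as np

def linear_polarization_to_jones(angle_deg):
    """Jones parameters (jx, jy, x_phase, y_phase) for linear polarization
    oriented at angle_deg from the x axis."""
    theta = np.deg2rad(angle_deg)
    return np.cos(theta), np.sin(theta), 0.0, 0.0
```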

The thing is that I would want to re-use the PSF library to produce a series of images based on a series of input images that do not share the same polarization distribution. A brute-force approach would be to compute a specific set of PSFs for each image separately, which would require a lot of calls to the Zemax API. On the other hand, if I were able to build the Mueller PSF matrix of the instrument, I could use this single library (computed only once) to generate/simulate the response of the instrument to any number of input scenes, whatever their input polarization.
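To make concrete what I mean by reusing a single library: for an incoherent scene, the image intensity only depends on the first row of the point spread (Mueller) matrix, so four PSF grids computed once with four independent input states (horizontal, vertical, 45° linear and right-circular, say) should be enough to predict the PSF for any input Stokes vector. A sketch of that reconstruction, where `psf_h`, `psf_v`, `psf_45` and `psf_rcp` stand for pre-computed grids that are mutually registered and normalized to the same input flux:

```python
import numpy as np

def psm_first_row(psf_h, psf_v, psf_45, psf_rcp):
    """First row (m00..m03) of the point spread matrix, reconstructed from four
    basis PSFs computed with input Stokes vectors (1,1,0,0), (1,-1,0,0),
    (1,0,1,0) and (1,0,0,1)."""
    m00 = 0.5 * (psf_h + psf_v)
    m01 = 0.5 * (psf_h - psf_v)
    m02 = psf_45 - m00
    m03 = psf_rcp - m00
    return m00, m01, m02, m03

def psf_for_stokes(first_row, stokes):
    """PSF seen by an incoherent source with the given input Stokes vector."""
    m00, m01, m02, m03 = first_row
    s0, s1, s2, s3 = stokes
    return m00 * s0 + m01 * s1 + m02 * s2 + m03 * s3
```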

