I have a layered coating for which I made a coating file. The file defines each material's index vs. wavelength and then the layer stack, giving the material and thickness of each layer. I am currently running a tolerance analysis to see the effects of varying the thickness and index of this coating, and I am seeing almost no sensitivity to changes in layer thickness or index, even when the thickness is changed by orders of magnitude. This does not match the real-world data that exists for this system.
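
For context, the file uses the standard MATE/COAT layout, roughly like the sketch below (the material names, indices, and thicknesses here are placeholders, not my actual design):

```
! Hypothetical COATING.DAT-style entries (placeholder values)
! MATE lines: wavelength (um), real index, imaginary index
MATE MYHIGH
0.450 2.35 0.0
0.550 2.30 0.0
0.650 2.28 0.0
MATE MYLOW
0.450 1.47 0.0
0.550 1.46 0.0
0.650 1.45 0.0
! COAT lines: material, thickness. Per my reading of the Help,
! thickness is in optical waves at the reference wavelength unless
! flagged as absolute - worth double-checking, since a relative-vs-
! absolute mix-up would change what a perturbation actually does.
COAT MYQWSTACK
MYHIGH 0.25
MYLOW 0.25
```
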
Is there a good place to start debugging this? I'm assuming there has to be an error in the analysis somewhere.

I'd start with the Coatings tab of the surface properties dialog: modify the layers by hand while checking the various polarization analyses, or use the CODA (coating data) operand.

My other thought is about the tolerancing criterion you are using. If you're looking at RMS spot size, wavefront, or contrast, these may well be insensitive to coating changes: coatings affect ray intensities and phases, but not where the rays land.
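
As a quick sanity check outside the software, a few lines of standalone thin-film math show how strongly an intensity-based quantity should respond to thickness errors. This is a minimal normal-incidence characteristic-matrix sketch with illustrative indices, not your actual stack:

```python
import numpy as np

def reflectance(layers, n_inc, n_sub, wl):
    """Normal-incidence reflectance of a thin-film stack via the
    characteristic-matrix method. layers = [(index, thickness_um), ...]
    listed from the incident medium toward the substrate; wl in um."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * n * d / wl  # phase thickness of the layer
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    B, C = M @ np.array([1.0, n_sub])
    r = (n_inc * B - C) / (n_inc * B + C)
    return abs(r) ** 2

wl = 0.55                      # design wavelength, um
n_film, n_glass = 1.38, 1.52   # e.g. an MgF2-like film on BK7-like glass
d_qw = wl / (4 * n_film)       # quarter-wave physical thickness

for scale in (0.5, 1.0, 1.5, 2.0):
    R = reflectance([(n_film, scale * d_qw)], 1.0, n_glass, wl)
    print(f"{scale:.1f} x QW thickness -> R = {R:.4f}")
```

Even this single layer swings between about 1.3% and 4.3% reflectance as its thickness varies, so an intensity-based criterion (transmission, reflectance, or the polarization analyses) should visibly move under the perturbations you describe. If it doesn't, the perturbations may not be reaching the coating definition at all.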



- Mark

I'm having the same issue. Did you ever find a solution?

