Solved

Fourier diffraction through a tilted slit

  • 18 April 2022
  • 1 reply
  • 203 views

Hello,

I am trying to reproduce Figure 3 from Ganci 1981 in OpticStudio (paper and model attached). Unfortunately, try as I might, my simulated diffraction pattern stays horizontal even when I tilt the slit (i.e., there is no observable bend). Would you happen to know whether this is an OpticStudio issue, or whether my model is not set up correctly? Thank you in advance.

Kind regards,
Yannis


Best answer by Jeff.Wilde 22 April 2022, 18:49


1 reply


Hi Yannis,

Very interesting paper and phenomenon. Based on the way POP is implemented in OpticStudio, I don’t think it is possible to model this effect of a curved diffraction pattern from a tilted slit.

From the analysis provided in the paper, we see that the “curving” of the diffraction pattern arises, at least mathematically, from the linear phase ramp term in the incident field. This term, when included in the far-field diffraction integral, produces a delta function in angle space that constrains the y-direction (vertical) portion of the diffracted field. The net result is a diffraction pattern that lies along a section of a cone.
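
As a sketch of that math in my own notation (the paper’s symbols and normalization may differ): idealize the slit as having width a along x and being infinitely long along y, and let the tilt angle theta give the incident field a linear phase ramp e^{i k y sin(theta)}. With direction cosines alpha and beta along x and y, the Fraunhofer integral is

\[
U(\alpha,\beta) \;\propto\; \int_{-a/2}^{a/2}\! dx \int_{-\infty}^{\infty}\! dy \; e^{\,i k y \sin\theta}\, e^{-i k (\alpha x + \beta y)}
\;\propto\; a\,\operatorname{sinc}\!\left(\frac{k a \alpha}{2}\right)\,\delta(\beta - \sin\theta).
\]

The delta function pins beta = sin(theta): every diffracted ray makes the same fixed angle with the slit’s long axis, so the allowed directions form a cone, and the intersection of that cone with a flat observation screen is a curved (conic) section rather than a straight line.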

Note that this integral is evaluated analytically in the paper. In general, it would be very difficult to do this numerically because of the spatial sampling required to properly capture the linear phase term (which, due to the y/lambda term in the phase, could easily span thousands of waves if the beam size in the y-direction corresponds to thousands of wavelengths). But of course POP only uses numerical evaluation of the diffraction integrals. So, POP is designed to remove these types of phase-tilt terms from the diffraction integral by tracking the beam along the chief ray, i.e., by using a local coordinate z-axis that always points along the chief ray direction.
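
As a rough back-of-the-envelope check (the numbers below are illustrative, not taken from your model):

```python
import numpy as np

# Rough sampling estimate for the linear phase ramp exp(i*k*y*sin(theta))
# across a beam of height D (illustrative values, not from the attached file).
wavelength = 0.55e-6          # 550 nm
D = 5e-3                      # 5 mm beam extent along y
theta = np.deg2rad(30)        # slit tilt

waves_of_tilt = D * np.sin(theta) / wavelength   # total phase excursion, in waves
samples_needed = 2 * waves_of_tilt               # Nyquist: at least 2 samples/wave

print(f"phase ramp spans ~{waves_of_tilt:.0f} waves "
      f"-> need >{samples_needed:.0f} samples along y")
# ~4500 waves -> ~9100 samples, well beyond typical POP grid sizes
```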

That being said, in POP, simply tilting the slit as you have done in your model will not alter the phase of the beam in any way to begin with. It’s my understanding that the phase imparted to the beam by any optical component surface is found by tracing probing rays, and tilting the slit doesn’t alter the relative phases of those probing rays. Moreover, any attempt to manually include a linear phase ramp across the slit will simply cause the chief ray to be deflected in accordance with the phase gradient. And again, since the POP local-coordinate z-axis always tracks the chief ray, the phase ramp is effectively removed from the diffraction integral. That makes the integral much more amenable to numerical computation, but it also means this particular phenomenon of a curved diffraction pattern cannot be observed (at least as far as I can tell).
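
To make the connection concrete, here is a small toy calculation of my own (a geometry sketch, not a POP computation) showing where the diffracted light lands on a flat screen with and without the phase ramp:

```python
import numpy as np

# Toy geometry sketch: ray intercepts on a flat screen at distance z, with the
# linear phase ramp kept versus removed (as POP's chief-ray referencing does).
# Direction cosines: alpha (x), beta (y), gamma = sqrt(1 - alpha^2 - beta^2).
z = 1.0                                  # screen distance [m]
theta = np.deg2rad(30)                   # slit tilt -> cone fixes beta = sin(theta)
alpha = np.linspace(-0.2, 0.2, 9)        # x-direction angles set by the slit width

beta_tilted = np.full_like(alpha, np.sin(theta))   # ramp kept: rays lie on a cone
beta_removed = np.zeros_like(alpha)                # ramp removed: beta forced to zero

for name, beta in [("ramp kept", beta_tilted), ("ramp removed", beta_removed)]:
    gamma = np.sqrt(1.0 - alpha**2 - beta**2)
    y = z * beta / gamma                           # intercept height on the screen
    print(f"{name}: y-spread on screen = {y.max() - y.min():.4f} m")
# With the ramp, the intercept height varies with alpha (the pattern bends);
# with it removed, y = 0 everywhere, i.e. a straight horizontal pattern.
```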

Regards,

Jeff
