People & Pointers
Use this space to show off your skills, introduce yourself, or chat about the latest in the world of optics.
To define an acceptance angle, you may use filter strings or define a table coating. For the first option, filter strings are documented under: The Setup Tab > Editors Group (Setup Tab) > Non-sequential Component Editor > Non-sequential Overview > The Filter String. For example, if you want to select only rays incident at an angle of less than 10 degrees (cos(10°) = 0.984808) on your detector (Object 2 in this case), whose normal vector points along local +z, the filter string would be: X_NGT(2, 0.984808). When running the ray trace, you can define this filter and save the ray data: a ray is saved only if it passes the filter. After that, you can use this ray database for your analysis. The second possibility would be to define a table coating and apply it to an object placed in front of the detector. With this coating format, the transmission and reflection may be defined as a function of incident angle and wavelength. To read a compl…
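The cosine threshold in the filter string above can be computed for any acceptance angle. A minimal sketch (the helper name is our own; the filter string syntax is from the example above):

```python
import math

def angle_filter_string(object_number, acceptance_angle_deg):
    """Build the cosine threshold for an acceptance-angle filter string.

    Rays incident at less than the acceptance angle have a cosine
    larger than cos(acceptance_angle), which is the value that goes
    into the filter string.
    """
    threshold = math.cos(math.radians(acceptance_angle_deg))
    return f"X_NGT({object_number}, {threshold:.6f})"

print(angle_filter_string(2, 10))  # → X_NGT(2, 0.984808)
```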
This thread is dedicated to the upcoming webinar: OpticStudio-OpticsBuilder Interoperability in Design of Optical Spectrometer. Any questions received during the webinar will be responded to as a reply on this thread. Feel free to post your own questions! The speaker will be notified and will respond as long as the thread is still open. Be sure to subscribe to this thread if you want to see additional discussion regarding this webinar topic. The thread will be open to new replies through Friday, June 17th. [The webinar has concluded]
Webinar details
Date: Thursday, June 9th
Time: 6:00 - 7:00 AM PDT | 11:00 AM - 12:00 PM PDT
Presenter: Mojtaba Falahati, Senior Application Engineer at Ansys Zemax
Abstract: Join Mojtaba Falahati, Senior Application Engineer, as he explores the optical-optomechanical design cycle for lens-grating-lens spectrometers using commercially available optical elements and describes how Zemax tools enable a joint workflow to turn optical designs into reality. Optical spec…
Here's the discussion space for the OpticsTalk: Deep Dive into Creating the Wavefront from the Spot Diagram, PSF from the Wavefront, and MTF from the PSF, hosted by OpticStudio Optical Test Engineer Michael Humphreys. Join Michael to discuss the principles of how OpticStudio calculates the Wavefront, PSF, and MTF. Have questions? Post them here before the talk!
In the following article, we introduced how to build an EPE based on a surface-relief grating (SRG). However, this approach does not work for holographic gratings: How to simulate exit pupil expander (EPE) with diffractive optics for augmented reality (AR) system in OpticStudio: part 1. The currently available Kogelnik model in OpticStudio cannot be used for an EPE waveguide. Here we will explain why and provide a workaround DLL. Note this is based on assumptions and the result could be inaccurate. Kogelnik's method assumes, as below, that the refractive index of the hologram itself and of its environment are the same. Even after the hologram fringes are developed, the average refractive index is still the same. In reality, however, the refractive index of the environment is different. For example, we might have a hologram with an average refractive index of n0 = 1.5, coated on a glass substrate with a refractive index of n1 = 1.7, while the other side is air, with a refractive index of n2 = 1.0. In a real situation, t…
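To get a feel for how much the index mismatch matters at the two boundaries in the example above (hologram n0 = 1.5, substrate n1 = 1.7, air n2 = 1.0), here is a quick normal-incidence Fresnel estimate. This is a back-of-the-envelope sketch only, not the Kogelnik coupled-wave result:

```python
def fresnel_r_normal(n_a, n_b):
    """Normal-incidence Fresnel power reflectance between two media."""
    return ((n_a - n_b) / (n_a + n_b)) ** 2

# Hologram (n0 = 1.5) against air (n2 = 1.0): noticeable reflection
print(f"{fresnel_r_normal(1.5, 1.0):.4f}")  # → 0.0400
# Hologram (n0 = 1.5) against the substrate (n1 = 1.7): much weaker
print(f"{fresnel_r_normal(1.5, 1.7):.4f}")  # → 0.0039
```

The asymmetry between the two sides is exactly what the equal-index assumption in Kogelnik's method cannot capture.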
This thread is dedicated to the upcoming webinar: Laser Applications with Ansys Zemax. Any questions received during the webinar will be responded to as a reply on this thread. Feel free to post your own questions! The speaker will be notified and will respond as long as the thread is still open. Be sure to subscribe to this thread if you want to see additional discussion regarding this webinar topic. The thread will be open to new replies until Friday, May 6th.
Webinar details
Register here: [The webinar has concluded]
Date: Thursday, April 28th
Time: 6:00 - 6:30 AM PDT | 11:00 - 11:30 AM PDT
Presenter: Flurin Herren, Application Engineer II at Ansys Zemax
Abstract: Laser beam propagation requires unique considerations when setting up models in optical design software. OpticStudio has a wide range of tools and capabilities for modeling laser applications. This webinar demonstrates various laser applications and components that can be simulated using OpticStudio.
This thread will be used to collect questions before the webinar, and to answer any questions we received during the webinar. Feel free to post your questions! Be sure to subscribe to this thread if you want to see additional discussion regarding this topic. The thread will be open to comments until Thursday, May 12th. [The webinar has concluded]
Webinar details
Date: Thursday, May 5th
Time: 6:00 AM PST & 11:00 AM PST
Presenters: Hui Chen, Senior Application Engineer; Taylor Robertson, Senior Application Engineer
Abstract: Complex optical systems require coupling simulation techniques across multiple length scales for accurate design and tolerancing. Extracting the light from nanoscale emissive structures in illumination systems, or propagating light through a mixture of guided and free-space components, are just a few examples. Ray-tracing approximations break down near the dimension of the wavelength, and electromagnetic approaches are too expensive for larger devices. Traditional methods…
This thread is dedicated to the webinar: Optotune Liquid Lenses Added to Zemax Stock Components. Any questions received during the webinar will be responded to as a reply on this thread. [The event has concluded]
Webinar details
Presenters: Mark Ventura, Vice President Sales & Marketing at Optotune. Mark is an electrical engineer and co-founded Optotune in 2008. As VP Sales & Marketing he has developed the market for liquid lenses with a focus on machine vision. Michael Büeler (@Michael.Bueeler), Head of Optics Engineering & VP Quality at Optotune. Michael holds a PhD in Biomedical Engineering and Optics and is responsible for optical design at Optotune. Employing Optotune's liquid lenses, he has designed several focusing, zoom, and illumination systems for mobile phone cameras, factory automation, and med-tech applications.
Abstract: In the past 10 years, liquid lenses have evolved to become a well-established solution for fast and reliable focusing. In this webinar we will discuss…
Sorry that this title is not accurate. TIR errors sometimes occur when we recreate a design from a patent, or when we add extra operands beyond those generated by the field wizard. I have been asked about this several times, and I got this error message during my current project, so I want to share some thoughts on how to handle it:
- Reduce the field/numerical aperture to remove the TIR first
- Apply the RAID/MXAI/MXRE/MXRI operands to constrain the incident angle
- Optimize the lens shape to reduce the angle
- Try HYLD in the optimization wizard: Designing for as-built performance with High-Yield Optimization – Knowledgebase (zemax.com)
An interesting problem has surfaced as the Zemax team at Ansys continues to develop toolsets to aid the simulation of small form-factor, wide-angle systems such as cellphone camera lenses. Different ways exist to set up and model a stop with a lens traversing its aperture, but which is the best way? With this post, we share our recommendations and seek your feedback based on your experiences tackling this challenge. Virtual propagation modeling is one of the ways to model these stops. Technically, the portion of the lens that goes through the stop will modify the wavefront before the stop limits the wavefront with its clear aperture. Therefore, we cannot use virtual propagation to model these stops, because we would be adding non-real pupil aberrations to our optical systems by forcing the entrance pupil to be flat when it is not. Here is an example of a stop with a lens going through it. By definition, “the aperture stop is the aperture in the system that limits the bundle of lig…
Optimization is extremely important in computer-aided design. However, it's never as easy as just clicking a button. We usually need to carefully set up the variables and merit function. While there are no general rules that apply to all systems, there are still some useful tricks for most common cases. Here I'm sharing some from my experience. If you post yours in a reply, I will also update this thread. :) [Michael Cheng]
1. Keep this in mind: always check and consider whether a variable really needs to be a variable.
- If the variable won't change system performance much, turn it off first. You can turn it on at the final stage for fine tuning. A typical case is the thickness of a lens. In many cases, lens thicknesses contribute much less than the AIR thicknesses in the system.
- Be careful: if you have a redundant variable, for example the Radius on an isolated STOP, the optimizer may be confused and simply cannot work well.
- If during optimization, a variable just…
Here is a quick note about two frequently asked questions: how to set up the parameters Holo type and Diffract order for the surface Optically Fabricated Hologram (OFH), and how to set up a reflection hologram using OFH. Before reading this post, make sure you have read the following KBA, which provides the background knowledge for this post: How to model holograms in OpticStudio.
Difference between OFH and other hologram types
For the other hologram models, we assume the construction systems are composed of two converging or diverging sources. (A collimated source is treated as the case where the convergence/divergence point is at infinity.) In the real world, however, holograms are built with the same laser plus some lenses, so the source is actually not a point but has some aberrations. Sometimes these aberrations are intentionally introduced to correct the aberrations that arise in the playback system.
The parameter 'Holo type'
For OFH, the rule for Holo type is a little different compared to other hologram models. Normally, …
To evaluate system throughput, there are four tools you can consider in OpticStudio Sequential mode: the Footprint Diagram, the Vignetting Diagram, Geometric Image Analysis (GIA), and the Transmission analysis. The Footprint Diagram and Vignetting Diagram deal with vignetting only and do not consider Fresnel reflection or bulk absorption losses. They show what fraction of the launched rays is being blocked/vignetted by surface apertures. The GIA can do the same thing, but it can also account for Fresnel reflection and bulk absorption losses if you check the Use Polarization option. The Transmission analysis always includes Fresnel reflection and bulk absorption losses. For example, in the system below, at the edge of the field (30 deg), the Footprint Diagram, the Vignetting Diagram, and the GIA (bottom reports “Percent efficiency %”, Use Polarization not checked) all report very similar system throughput. If you check Use Polarization in the GIA, then it will consider the Fresnel reflectio…
For a pancake type of VR, most of the stray light issues come from an imperfect polarizer model. Here we provide an example that users can play with for investigation. The components in the system are shown below. In the polarizer (object #6), users can control the light leakage on the reflection and transmission sides. The polarizer is based on this paper, but with some modifications: Non-paraxial idealized polarizer model. Note that the input light has circular polarization. Enjoy!
Speckle is created by the interference of multiple, mutually coherent optical fields that have random phases at any point on a detection surface. This could be associated with laser light scattering from a rough surface, or with the interference of multiple modes emanating from a multimode fiber. For multimode fibers, it may be best to use POP in a sequential model in which the beam leaving the fiber is modeled as a coherent superposition of LP (or LG) modes. Alternatively, we show here how a simple non-sequential ray-based model can be used to simulate speckle arising from rough-surface scattering. To model light leaving a rough surface, we can employ Huygens’ Principle and use a set of point sources that are randomly phased over (0, 2pi). This assumes the height variations on the surface are large enough to create such a random process. Ideally, we would use a single non-sequential source that emits rays uniformly over some spatial region and within a defined cone angle. However, …
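The random-phase point-source idea can be prototyped outside OpticStudio as well. A minimal numpy sketch under assumed geometry (all names, wavelengths, and dimensions here are our own, not part of the non-sequential workaround described above):

```python
import numpy as np

rng = np.random.default_rng(0)

wavelength = 0.6328e-6          # assumed HeNe wavelength, metres
k = 2 * np.pi / wavelength
n_sources = 500                 # point sources spanning the rough patch

# Random source positions over a 1 mm patch, random phases over (0, 2*pi)
xs = rng.uniform(-0.5e-3, 0.5e-3, n_sources)
phases = rng.uniform(0.0, 2 * np.pi, n_sources)

# Detector: a 1-D line of 512 pixels, 100 mm downstream
z = 0.1
xd = np.linspace(-5e-3, 5e-3, 512)

# Huygens sum: spherical-wave phase from each source to each pixel
r = np.sqrt((xd[:, None] - xs[None, :]) ** 2 + z ** 2)
field = np.sum(np.exp(1j * (k * r + phases[None, :])) / r, axis=1)
intensity = np.abs(field) ** 2

# Fully developed speckle has contrast (std/mean) near 1
contrast = intensity.std() / intensity.mean()
print(f"speckle contrast: {contrast:.2f}")
```

The contrast hovers near 1 for fully developed speckle; the exact value varies with the random seed and the number of speckles sampled by the detector.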
Here's the discussion space for the OpticsTalk: Beam Steering Approaches for Automotive & Consumer LiDARs. Lidar is considered the primary sensor for depth perception in applications ranging from self-driving cars to the latest Apple iPhone. Beam steering is often the defining component of a lidar system, determining performance, size, and reliability. In this talk, we will review the trade-offs that must be considered while developing a lidar system for both mechanical and solid-state beam steering approaches. Lidar is an application-specific sensor that must satisfy vastly different requirements across the automotive and consumer markets. In this context, we will discuss Lumotive’s lidar system, which employs a CMOS-based liquid crystal metasurface (LCM) to holographically steer light over a wide field of view while maintaining a large aperture. The metasurface consists of nano-scale optical antennas that give sub-wavelength control of the phase, amplitude…
In this thread, I introduce a user-extension, DetectorToTIFF, to save Detector Viewers as TIFF images. Presently, the user-extension supports Detector Rectangle objects, which are saved as monochrome 8- or 16-bit TIFF images. This user-extension is meant to complement the Output Image File field of the Detector Viewer's settings.
Version
Current version: 0.7. This extension has been developed and tested on OpticStudio 19.4 SP2. The latest release, v0.8, can be found in my latest reply to this thread.
Installation
To install the user-extension, download the archive DetectorToTIFF.ZIP and extract it on your computer. This archive contains a local version of this thread and the user-extension DetectorToTIFF.EXE, which you must copy to the \Documents\Zemax\ZOS-API\Extensions folder of OpticStudio.
Use
To use the user-extension, open a non-sequential file, such as User aperture sample.zmx from the \Documents\Zemax\Samples\Non-sequential\Miscellaneous folder of OpticStudio. Then, run a ray t…
This thread is dedicated to the upcoming webinar: Modeling a Lidar System in OpticStudio: Characterizing Range for Lidar Systems. Any questions received during the webinar will be responded to as a reply on this thread. Feel free to post your own questions! The speaker will be notified and will respond as long as the thread is still open. Be sure to subscribe to this thread if you want to see additional discussion regarding this webinar topic. The thread will be open to new replies for a limited time following the event. [The webinar has concluded]
Webinar details
Date: Thursday, October 27th
Time: 6:00 - 6:45 AM PDT | 11:00 - 11:45 AM PDT
Presenter: Angel Morales, Senior Application Engineer
Abstract: For lidar systems, a key specification is the range at which the lidar can detect a positive return signal. This characteristic is determined by several factors, such as the energy contained in the light pulse sent by the…
In Q3 2021, Zemax partnered with the Optica Foundation to identify and support exceptional optics-focused students in emerging economies. I’m very excited to have these individuals join our ranks within the Zemax user community, and I hope that everyone will join me in welcoming and mentoring them. Samuel ( @sizapatav23 ) - Jesus ( @cansecodiaz ) - Naresh ( @Naresh.Kumar ) - Anda-Maria ( @Maria T ) - Adewale ( @adewale ) - Melvin ( @Melvin.James ) For the students, please reply on this thread to tell the community a bit about yourself! I think we’d all like to know:
- Degree you’re pursuing
- Your academic institution
- What are your passions within the field of optics?
- What do you hope to study or learn with your OpticStudio software?
- Is there anything you’d like to ask of the community members?
This forum thread should be used to continue the discussion from the Envision 2020 workshop, Designing for Manufacturability. Presenter: Katsumoto Ikeda. Abstract: Careless approaches to optical design can result in designs that are sensitive to manufacturing and alignment errors, resulting in optical products that are difficult to manufacture successfully and repeatably. Methods for desensitizing the optical system, including High-Yield Optimization, produce designs that meet tight performance specifications, provide a higher manufacturing yield, and lower manufacturing costs through less waste. Join our workshop to discover our best practices in using these methods. You can also find the Envision 2020 LinkedIn group here.
This thread is dedicated to the upcoming webinar: Designing Cell phone Camera Lenses with an Interoperability Workflow – Part 2. Any questions received during the webinar will be responded to as a reply on this thread. Feel free to post your own questions! The speaker will be notified and will respond as long as the thread is still open. Be sure to subscribe to this thread if you want to see additional discussion regarding this webinar topic. The thread will be open to new replies for a limited time following the event. [The event has concluded]
Webinar details
Date: Thursday, September 22nd
Time: 6:00 - 6:45 AM PDT | 11:00 - 11:45 AM PDT
Presenter: Esteban Carbajal, Senior Product Manager
Abstract: Cellphone camera designs are required to meet ever more stringent performance specifications to compete in the field. Laboratory testing typically occurs late in the manufacturing phase, where any previous errors in estimates that are found will cause significant schedule delays and cost increases.
I could not renew my ZEMAX support contract because I am self-employed. My licence is a perpetual licence, which means that I can continue to use ZEMAX only in its current version. I will continue to have questions about how to use ZEMAX. I am looking for a place to discuss ZEMAX after I no longer have access to this community, and for people to personally advise me on ZEMAX. If you know of anyone who might be able to help, I would be grateful if you could respond. My email address is firstname.lastname@example.org.
KIDA Hideki
This forum thread should be used to continue the discussion from the Envision 2020 workshop, Coordinate Breaks. Presenter: Angel Morales Abstract: Coordinate Breaks are instrumental when modeling off-axis and other non-rotationally symmetric systems. However, specifying parameters like arbitrary pivot points in 3D space is non-trivial in a sequential coordinate system. In this session, we'll take a closer look at setting up Coordinate Break surfaces and utilizing functions like the Coordinate Return and different Solve types to more efficiently set up complex systems. You can also find the Envision 2020 LinkedIn group here.
The polychromatic RMS wavefront error is calculated based on the following equation: RMS_polychromatic = sqrt( sum over all pupil rays at all wavelengths of (OPD of the ray)^2 × (weight of the ray) / sum of the ray weights ). This means that the polychromatic RMS wavefront error is calculated for all wavelengths at the same time and over the whole pupil. For polychromatic results, the reference is the primary wavelength, so the wavefront error is calculated for each wavelength with the reference set to the chief ray at the primary wavelength. That is, for each wavelength, the rays are first propagated to the image plane and the optical path length of each ray is recorded; then the rays are propagated back to the reference sphere defined by the chief ray at the primary wavelength, and the path length of each ray is recorded again during this back-propagation; finally, the back-propagation path length is subtracted from…
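The weighted pooling over rays and wavelengths can be written as RMS = sqrt( Σ wᵢ · OPDᵢ² / Σ wᵢ ). A tiny numeric illustration of that pooling (the OPD and weight values below are made up, not from any real system):

```python
import math

def rms_wavefront(opds, weights):
    """Weighted RMS over all rays of all wavelengths pooled together."""
    num = sum(w * opd ** 2 for opd, w in zip(opds, weights))
    return math.sqrt(num / sum(weights))

# Hypothetical OPDs (in waves) for rays of two wavelengths, pooled
# into one list; the second wavelength carries half the weight.
opds = [0.10, -0.05, 0.20, -0.10]
weights = [1.0, 1.0, 0.5, 0.5]

print(f"{rms_wavefront(opds, weights):.4f}")  # → 0.1118
```

Note the sign of each OPD disappears under the square, which is why the RMS pools all wavelengths symmetrically about the primary-wavelength reference.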
Here's the discussion space for the OpticsTalk: The Strengths & Applications of Various Freeform Surface Representations, hosted by Zemax Computational Physicist Radu Miron. Join Radu to learn more and discuss freeform surfaces in Zemax products. Stay tuned on this forum thread for updates and discussion after the talk, and check out our other OpticsTalks!