In this example, we introduce a simulation workflow to analyze the optical performance of a monochrome AR (Augmented Reality) system built from a projection lens and one-dimensional gratings under specific lighting conditions. The optical lens system, designed with Ansys Zemax OpticStudio, and the gratings, designed with Ansys Lumerical, are exported to Ansys Speos for system-level analysis. This example mainly covers the Ansys Speos part of the overall workflow.
Overview
Understand the simulation workflow and key results
Augmented Reality (AR) is a technology that combines the virtual world on a screen with real-world scenes. This article demonstrates how to use Ansys's complete optical solution to design and analyze an AR system formed by input/output and exit pupil expander (EPE) diffraction gratings. We import the optical lens system from Zemax OpticStudio and the optical grating data from Lumerical into Speos to perform system-level performance analysis, since these systems must operate in the real world with human perception. This interoperability workflow captures the interplay between the nanoscale structure of the gratings and the macroscale structure of the projection lens when using Speos to simulate the entire AR optical system in a 3D environment. Users can optimize the components and construct an accurate perception of the 3D scene with lifelike illumination and physically unbiased photometric/radiometric rendering.
There are three main tools needed for this virtual solution:
- Zemax OpticStudio to design the projection lens and export the optical system from Zemax to Speos using the "Export Optical Design to Speos" (.odx) feature
- Lumerical RCWA (Rigorous Coupled-Wave Analysis) or FDTD (Finite Difference Time Domain) to model the diffraction grating surfaces that scatter light into the supported grating orders and export the grating surface model as a JSON file
- Speos for in-depth system-level analysis, seamlessly and accurately integrating the projection lens model via the Optical Design Exchange (.odx) capability and incorporating the subwavelength diffraction grating surfaces for the couplers via JSON files. Speos uses GPU/CPU computation for accurate non-sequential ray tracing and generation of photometric/radiometric outcomes, including spectral irradiance and radiance maps, and validates the AR system's performance by factoring in human perception for comprehensive analysis.
This application requires Lumerical, Zemax OpticStudio, and Speos, but in this example, we focus on the steps in Lumerical and Speos, assuming that a lens model for the projection lens system is already available.
Exit pupil expansion is one of the common techniques used in waveguide-based AR systems. In the example shown below, the light from a display source is projected into a projection lens, which converges the light onto the in-coupling grating of the AR system. The light is then diffracted and propagates through the waveguide by total internal reflection (TIR). Once the light reaches the expander grating, it is folded by 90 degrees and expanded toward the last grating in the system, which couples the light out of the waveguide to the eye.
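For reference, the coupling and guiding behavior can be summarized by two simple relations, written here for a generic 1-D grating of period $\Lambda$, vacuum wavelength $\lambda$, and waveguide refractive index $n$ (the symbols are illustrative and not taken from the project files):

$$n \sin\theta_m = \sin\theta_i + m\,\frac{\lambda}{\Lambda} \qquad\text{(grating equation at the in-coupler, incidence from air at }\theta_i\text{)}$$

$$\theta_m > \theta_c = \arcsin\!\left(\frac{1}{n}\right) \qquad\text{(TIR condition for the diffracted order inside the waveguide)}$$

Orders that satisfy both relations are trapped in the waveguide and carried toward the expander and out-coupling gratings.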
In this example, we use a simple eyepiece lens with some optimization for the purpose of demonstration. The requirements on the light engine for this AR waveguide system have many similarities to those of an eyepiece: both have their pupils outside the system. In general, the lens component for an AR system can be more complicated in order to achieve high image quality and meet other system requirements. It can also depend on the light source type. For example, if the light source is a laser diode, then reflecting mirrors might be used instead of lenses.
Step 1: Lens System Design with Zemax OpticStudio (not covered in this article)
In this step, a lens system is designed in Zemax OpticStudio and exported in a file format readable by Speos (*.odx) for system-level analysis. The *.odx file includes the geometries of the system with their positions and orientations, the position and orientation of the imager, and the materials and coatings used in the lens system.
We create a simple lens system for the input of the image into the AR waveguide. We simply pick a triplet eyepiece from the Zemax design templates and reoptimize it to some extent for the purpose of this AR waveguide.
Step 2: Grating Design in Lumerical (not covered in this article)
The waveguide-based AR system in this example relies on the diffraction gratings to control the propagation of the beam in the waveguides. The periodic wavelength-scale structures of the gratings are simulated using the RCWA solver. The diffraction properties of the in-coupling, out-coupling and expander gratings are saved in a JSON data file, which is imported into Speos as a surface property to model the properties of the subwavelength structures in a ray tracing simulation.
The following requirements on the in-/out-coupling and expander gratings are used to decide on their appropriate shapes and periods (a quick period estimate is sketched after the list):
- Target wavelength: 520 nm
- Grating material: glass
- Transmissive grating for in-/out-coupling with high 1st-order efficiency
- Reflective grating for folding/expanding with high 0th-order diffraction efficiency
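As a quick sanity check on the in-coupler period (not part of the delivered project files), the two relations given earlier can be evaluated at normal incidence for the first diffraction order. A minimal Python sketch, assuming a nominal glass index of 1.5:

```python
# Rough period window for the in-coupling grating at normal incidence.
# Assumed values for illustration only: 520 nm target wavelength, nominal glass index 1.5.
wavelength_nm = 520.0
n_glass = 1.5

# The first order propagates in the glass when sin(theta_1) = lambda / (n * period) < 1.
period_min_nm = wavelength_nm / n_glass

# The first order is trapped by TIR when sin(theta_1) > 1 / n, i.e. lambda / period > 1.
period_max_nm = wavelength_nm

print(f"In-coupler period window: {period_min_nm:.0f} nm < period < {period_max_nm:.0f} nm")
# -> roughly 347 nm < period < 520 nm for n = 1.5
```

The actual periods and profiles used in this example are chosen in Lumerical, where the RCWA solver also accounts for the grating shape and the efficiency targets listed above.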
Step 3: Speos Analysis
In Speos 2024 R1 and above, users can import and position the lens model from Zemax using the optical design exchange feature and run ray tracing analysis with the GPU or CPU. The .odx file serves as a container storing information about the lens design, spectral material and coating properties, stop surface, and sensors. For more information about the content and the supported surface and object types, refer to the link.
The grating parameter files (JSON) can be imported into Speos as surface properties to model the behavior of the subwavelength diffractive optical elements of the AR system. In Speos, we run a ray tracing photometric simulation, explore how rays interact with the waveguide-based AR system, and extract the key human perception metrics from the spectral radiance map. In addition, visualizing the optical system from several user-defined points of view with an observer sensor allows users to explore human eye perception under realistic illumination conditions at multiple viewpoints.
Run and Results
Instructions for running the model and discussion of key results
To explore the optical performance of the waveguide-based AR system, we test it with different light sources, including ambient illumination and a display source.
Step 1: Zemax OpticStudio Simulation
The lens system is designed in Zemax OpticStudio and exported as an .odx file: we opened the AR_Projector_end.zar file from the ZOS_Projector_Lens folder in Ansys Zemax OpticStudio 2024 R1 and used "Export Optical Design to Speos" from the Export group under the Files tab. The generated .odx file is saved in the ZOS_Projector_Lens folder.
Note that both sequential and non-sequential components can be exported in the .odx format; however, components that include a Zemax Black Box cannot be exported.
In case you don’t have access to Zemax OpticStudio 2024 R1, or your lens system is not supported by this feature in Zemax OpticStudio, you can still export the lens system as a STEP file from Zemax OpticStudio, insert it into Speos, and apply the optical properties; this approach can also be used for more complex lens systems.
Step 2: Lumerical Simulation - RCWA
The photonic simulation of the subwavelength grating structures is done with the RCWA solver, and the resulting information is exported as a JSON file that fully characterizes the structure for all angles of incidence and wavelengths, to be used for system-level study with Speos. A script-based approach is used to calculate and export the surface properties from the Lumerical tools. More information about the Lumerical Sub-Wavelength Model (LSWM) can be found in Speos Lumerical Sub-wavelength Model – Ansys Optics.
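Before importing the JSON file into Speos, it can be useful to do a quick sanity check on the export. The sketch below only loads the file and lists its top-level keys; the file name matches the one used later in this example, but the internal structure is not documented here, so no specific fields are assumed.

```python
import json
from pathlib import Path

# Path to the grating data exported from Lumerical (same file referenced later in this example).
grating_file = Path("RCWA_green_in_out_export.json")

with grating_file.open("r", encoding="utf-8") as f:
    grating_data = json.load(f)

# Print the top-level structure so you can confirm the export completed and
# covers the expected wavelengths/angles before loading it into Speos.
if isinstance(grating_data, dict):
    for key, value in grating_data.items():
        print(f"{key}: {type(value).__name__}")
else:
    print(f"Top-level JSON type: {type(grating_data).__name__}")
```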
Step 3: Speos Simulation
In Speos 2024 R1, open the Speos simulation file AR Stereo system 520nm.scdocx from the downloaded START-Augmented-reality-optical-system folder. As you can see, the Speos project includes the AR system waveguide, the imported lens model, and predefined sensors and sources to speed up the simulation setup. To import and position the lens model in Speos using the optical design exchange feature, which is GPU compatible, the following steps were already taken.
- We clicked on the "Optical Design Exchange" tool in the Components group located under the "Light Simulation" ribbon in Speos. A new "Optical Design Exchange.1" feature is created in the Simulation panel.
- Because the meshing quality of the imported lenses affects both the simulation performance and the quality of the produced results, we define a specific meshing directly on the Optical Design Exchange feature: right-click the Optical Design Exchange.1 feature in the simulation tree, open its options, and set the following meshing values for optical imaging systems.
Additional information about the meshing can be found in the "Important Model Settings" section.
- We located the lens system at the InCoupling.Origin axis system, browsed for the "*.odx" file from the Definition panel, and clicked "Compute" to launch the import.
- As a next step, we added the Optical Design Exchange.1 feature from the simulation tree to the predefined simulations.
Note: Depending on the goal of your project, you can start designing the opto-mechanical parts in Speos, import opto-mechanical parts designed in other CAD platforms, and/or modify the lens geometry on the ODX using SpaceClaim functionalities (add lens edges, modify the aperture diameter of a stop surface, …). Keep in mind that the ODX is a container that links the SpaceClaim geometries to the optical properties applied to the different lenses and surfaces.
Importing Surface Properties to Speos
- To define the grating in Speos, click on the material in the Simulation panel.
- In the Gratings FDTD material, click on the In Coupling Grating and edit Surface layer #1
- In the Definition panel, under Surface properties, set Type to Plugin, set File to the lumerical-sub-wavelength-2024R1-GPU.sop file from the Speos input folder, and set Parameters to the JSON file containing the grating data (in this example we used RCWA_green_in_out_export.json).
- Under Texture image, set Type to From File and File to a plain white image (all the required files are included in the Speos Input files folder).
Orienting the Surface Properties in Speos
By default, the LSWM is applied z-normal to the surface and is oriented according to the global assembly reference. It is possible to apply a local reference with UV mapping, for example to rotate a grating around the normal of the surface.
- To edit the UV mapping of grating #1, click on Grating01 Orientation under In-coupling UV mapping in the structure tree. Modify the origin, the projection direction (Z axis), and the top direction (Y axis) from the origin system.
Note:
- In this example, the anisotropy is oriented on the geometries as follows: vertical orientation for the in-coupling grating, 45° orientation for the expander grating, and horizontal orientation for the out-coupling grating property.
- To account for the spatial variation of the gratings in the Expander Grating, a gradient efficiency is applied. To do so, another surface layer with an optical polished property and a specific texture and orientation is added to the Expander Grating. This texture acts as a mask: fully opaque or fully transparent pixels give all-or-nothing behavior, while semi-transparent pixels act as a probability. Here the texture is white with a variation in the alpha channel driving the transparency, such that opaque white means 100% efficiency of the next layer (the grating layer) and fully transparent means 0% efficiency. A sketch for generating such a gradient mask is shown below.
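As an illustration only (the required texture files are already provided in the Speos Input files folder), a gradient mask of this kind can be generated with a few lines of Python using Pillow and NumPy; the resolution and the linear left-to-right ramp below are arbitrary choices.

```python
import numpy as np
from PIL import Image

# Arbitrary texture resolution for illustration.
width, height = 1024, 256

# White RGB channels; alpha ramps linearly from fully transparent (0% grating
# efficiency) on the left to fully opaque (100% grating efficiency) on the right.
rgba = np.zeros((height, width, 4), dtype=np.uint8)
rgba[..., :3] = 255
alpha_ramp = np.linspace(0, 255, width, dtype=np.uint8)
rgba[..., 3] = np.broadcast_to(alpha_ramp, (height, width))

Image.fromarray(rgba, mode="RGBA").save("expander_gradient_mask.png")
```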
Ray Behavior Analysis
A grating can diffract light into several beams travelling in different directions with specific efficiencies. To explore the effect of a grating on light propagation, users can run an interactive simulation, which visualizes the propagation of rays from the light source through the optical system.
- Once the system is set, compute the interactive simulations Interactive Parallel Beam 520nm and Interactive Display Source 520nm. The interactive simulation with the parallel source generates light rays coming from a monochrome source with zero angle of emission through the above-mentioned waveguide-based AR optical system, while the interactive simulation with the Display source shows the behaviour of rays emitted from a display, considering its physical properties such as intensity distribution, through the optical system.
Analysis of Irradiance and Illuminance Distribution
Users can gather rays and analyze the homogeneity via an irradiance sensor, which computes the irradiance (in W/m²) or illuminance (in lux) of the light.
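For reference, the photometric quantity is the spectral radiometric quantity weighted by the CIE photopic luminosity function $V(\lambda)$:

$$E_v = K_m \int E_{e,\lambda}(\lambda)\, V(\lambda)\, d\lambda, \qquad K_m = 683\ \mathrm{lm/W}$$

For a monochromatic 520 nm source this reduces to scaling the irradiance by $683 \times V(520\,\mathrm{nm}) \approx 683 \times 0.71$.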
- Right-click the Direct.Uniform_Display_Source_520nm simulation in the simulation tree and select GPU compute, or Compute for CPU compute. Using this simulation result, users can explore the irradiance coming from a 520 nm uniform display source at the output-coupler area of the waveguide. Once the simulation is completed, double-click the XMP result to open the irradiance map in the Virtual Photometric Lab and check the uniformity.
- Inside the Virtual Photometric Lab, change the display from True Color to Black to white (color) to review the illuminance values at different positions. Change All wavelengths to 520 nm to explore the illuminance values at 520 nm.
- Click on the Level tool to see and adjust the maximum value of the legend bar for the illuminance; use the Measures tool and modify the area of interest to measure the average illuminance value on the desired area.
Note: If you want to select a region of interest in the irradiance sensor plane and navigate the associated ray path in the Speos 3D environment, make sure to enable the Light Expert feature in the Direct.Uniform_Display_Source_520nm simulation, select LXP on the corresponding sensor, and then use Compute to run the simulation.
Then, from the generated results of Direct.Uniform_Display_Source_520nm, click on the *.LPF result and use the Measures tool to navigate the associated ray path from the sensor through the optical elements in the Speos 3D environment.
Assess the Final Image from Display and Perceived Distortion
To achieve a better sense of the perceived final image and assess the distortion, a radiance sensor can be used; it collects light and computes the radiance (in W/(m²·sr)) and luminance (in cd/m²) from an observer point of view with a specific FOV.
- Open the "Speos Results file" folder and explore the already computed results. To explore the radiance map captured from Display, you can check the results of Inverse FDTD 520nm simulation>Inverse FDTD 520nm.Radiance FHD 520nm.xmp result from Simulation tree in Speos Simulation or directly from pre-computed stored data in the "Speos Results file">"Speos output file" folder using Xmp viewer. Then click on the Measures tool and modify the area of interest, to show the actual luminance and colorimetry from the display source on the desired area of the sensor.
Note: Inverse simulation propagates a large number of rays from a sensor to the sources through an optical system. This is useful for analyzing optical systems such as the one above, where the sensors are small compared to the illuminated geometry, or when light is diffused.
To quantify the contrast sensitivity of the system, a radiance simulation with both the Environment and Display sources allows information to be overlaid onto the user's visual field. The overlay can be done with the Photometric Lab Cal tool, which combines (Map Union) the *.xmp results from the Environment and Display sources.
Note: The Environment light source is based on an HDRI image of the scene in which the device is observed.
- From the "Speos Results files">"Combined Radiance" folder, open the Inverse.Radiance FHD_Combine.xmp . The file will be opened with Virtual photometric lab. Use the Virtual Lighting Controller Parameters on the results to assess the contribution of the sources of the current configuration independently and adjust the corresponding power or ratio values.
Note: To analyze the contribution of light sources without running a new simulation, the Virtual Lighting Controller allows users to drive the levels of the ambient light and the display independently and to analyze multiple ambient light conditions from a single simulation.
Assess the Variation in Distortion and Homogeneity Around the Nominal Position
For wearable optical systems, the designer must accommodate variations in the human eye and in usage; therefore, it is crucial to explore the AR system performance at different observer viewpoints. To visualize the optical system from multiple user-defined points of view in a more immersive way, an observer sensor is used, and the corresponding observer results from the Environment light source are combined with those from the Display light source.
- From the Simulation tree, open the Inverse FDTD Observer results, Inverse FDTD Observer Ambient.Observer FHD.speos360 and Inverse FDTD Observer 520nm.Observer FHD 520nm.speos360, and navigate through the different observer viewpoints to explore the results with the ambient source and the display source, respectively. To assess the contrast variation in the radiance results, use the union operation from the Creation tool in the VR-Lab viewer to combine the above-mentioned results; the precomputed result can be found in the "Speos Results files" > "Combined Observer" folder.
Human Eye Perception
In the case of an AR system, the human eye is the ultimate sensor. The eye assesses the visibility of things, the color of things, and the entire surrounding world. It is sensitive to overly bright luminance, overly dark details, and poor homogeneity. With the Human Vision Lab, users can take the biological characteristics of the eye into account and predict light behavior as it is perceived by the human eye.
- From the "Speos Results files">"Combined Radiance" folder, open the Inverse.Radiance FHD_Combine.xmp with Human Vision Lab and modify the Vision parameters .
With the Vision Parameters tool, users can modify the Adaptation type to explore Dynamic or Local adaptation. With Dynamic adaptation, it is possible to model the spatial adaptation of the human eye, for example when the eye is coming from a dark room.
Users can also change the age of the observer, the glare effect, and eye deficiencies, and modify the conditions of observation of the scene.
Note: Glare is the contrast-lowering effect of stray light in a visual scene. It occurs because light sources located in the periphery of the visual field are diffused by the various diopters constituting the human eye, which can darken the central zone of vision, the fovea. Glare forms a veil of luminance that reduces the contrast and therefore the visibility of a target.
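A commonly used approximation of this effect is the CIE (Stiles–Holladay) disability-glare formula: a peripheral glare source producing an illuminance $E_{gl}$ (in lux) at the eye, at an angle $\theta$ (in degrees, roughly between 1° and 30°) from the line of sight, adds an equivalent veiling luminance of about

$$L_{veil} \approx 10\,\frac{E_{gl}}{\theta^{2}} \quad [\mathrm{cd/m^2}]$$

This veil is added to both target and background luminance, which lowers the perceived contrast.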
Assess Legibility/Visibility
Using the Human Vision Lab, users can also accurately assess the readability of the displayed information according to its contrast with the background. In addition, with the Visibility tool, users can determine whether the message is visible.
For legibility, the technique is to run a simulation with the Display source and the Environment source, and then measure the letter luminance, the surrounding background luminance, and the angular size of the letter.
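With letter luminance $L_t$, surrounding background luminance $L_b$, letter height $h$, and viewing distance $d$, the (Weber) contrast and the angular size of the letter are

$$C = \frac{L_t - L_b}{L_b}, \qquad \alpha = 2\arctan\!\left(\frac{h}{2d}\right)$$

These measured values, together with the age of the observer, are the kind of inputs used by the RVP evaluation described below.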
- Open the Inverse.Radiance FHD_Combine.xmp with the Human Vision Lab and select the Legibility/Visibility tools from the Tool menu. Set the corresponding parameters to get the relative visual performance of an observer according to the letter and background luminance values, the size of the letter, the age of the observer, etc. The Human Vision Lab computes the Relative Visual Performance (RVP) over age, a CIE standard that gives the perceived contrast as a function of age. Let us assume the target user is 55 years old, and check whether the letter selected in a luminance map can be read.
With the Legibility tool, we can explore whether we need to lower the background luminance, or increase the letter luminance or the letter size, to meet the recommended contrast in terms of RVP.
- Another study is visibility, in which users evaluate the visibility of an object. Using the Visibility tool, select the target and background luminance values, the adaptation parameters, and the age of the observer to determine whether, under the specified contrast, the target is visible.
Important Model Settings
Description of important objects and settings used in this model
Zemax Lens System
The main specs we look for regarding the projection lens system are:
- Entrance Pupil Diameter is 5 mm. This should match the size of the in-coupling diffraction grating on the waveguide.
- Full field of view (FOV) is 40 degrees. This depends on the AR waveguide design and is especially linked to the k-space arrangement (see the relation sketched after this list). Usually, the FOV is simply determined by the AR waveguide spec, and the lens system follows it.
- Total track from the entrance pupil to the image plane is 22 mm. For this spec, the smaller the better, but it must be balanced against image quality.
- Image quality: in this example, we simply optimize for a small spot size for each field; we did not constrain distortion. This is good enough for the purpose of the demonstration, but an actual AR system can have more spec requirements.
- In this example, the Display source is close to the lens system, meaning that it is unlikely to be an LCoS or MEMS system, where a beam splitter cube is often needed between the image display and the lenses. We do not make any particular assumption here, but this could be a micro-LED display with high luminance.
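The k-space link mentioned in the FOV item above can be sketched with a simplified 1-D picture: for every field angle $\theta$ in the FOV, the in-plane wave vector after the in-coupler must lie between the air and glass light lines so that the diffracted beam both propagates and is guided by TIR,

$$k_0 < \left| k_0 \sin\theta + \frac{2\pi}{\Lambda} \right| < n\,k_0, \qquad k_0 = \frac{2\pi}{\lambda}$$

so the waveguide index $n$ and the grating period $\Lambda$ together bound the achievable FOV, and the projection lens only needs to deliver that FOV.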
Lumerical Grating Design
The following requirements on the in-/out-coupling and expander gratings are used to decide on their appropriate shapes and periods.
- Target wavelength: 520 nm
- Grating material: glass
- Transmissive grating for in-/out-coupling with high 1st-order efficiency
- Reflective grating for folding/expanding with high 0th-order diffraction efficiency
Speos Local Meshing
The sag value is set to a fixed 1 µm on the imaging lenses, since they are curved structures and need to be finely meshed to accurately model the lens performance.
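The sag criterion bounds the deviation between a mesh facet and the true curved surface. For a local radius of curvature $R$, a facet of chord length $c$ deviates by approximately the sagitta

$$s \approx \frac{c^{2}}{8R} \quad\Rightarrow\quad c \approx \sqrt{8\,R\,s}$$

so with $s = 1\ \mu\mathrm{m}$ and, for example, $R = 20\ \mathrm{mm}$ (an illustrative value, not taken from the lens file), facets are about 0.4 mm across. This is why curved lens surfaces end up much more finely meshed than the flat parts.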
The rest of the parts, which are flat, do not require fine meshing. Further details about mesh settings can be found in Meshing Properties.
Updating the Model with Your Parameters
Instructions for updating the model based on your device parameters
Speos Simulation Parameter
In this example, we only included the head CAD as a geometry in the simulation with the Environment source, to block most of the ambient light coming from the back of the head. For more advanced analysis, users can include the head CAD as a geometry in the simulation with the Display source and consider possible back reflections from the observer's eye.
Note: To simulate multiple viewpoints in a single shot, it is recommended to use high computing power such as HPC (High Performance Computing) or the cloud. Using 10 passes is enough to get a correct overall aspect, but for noise-free results 100 passes are recommended.
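This recommendation follows the usual Monte Carlo behavior: assuming statistically independent passes, the noise in the result decreases with the square root of the number of passes,

$$\sigma \propto \frac{1}{\sqrt{N_{\mathrm{passes}}}}$$

so going from 10 to 100 passes reduces the noise by roughly $\sqrt{10}\approx 3.2$ at about ten times the computation cost.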
Speos Material Parameters
In this example, an ideal antireflective property is applied to the lens system. To visualize the origin of stray light directly within the AR geometry and see the effects of secondary-sequence stray light, the face property can be replaced by optical polished, which is suitable for perfectly polished materials such as glass. In addition, the border of the waveguide is defined as 100% absorbent; to study reflections from the border of the waveguide, the corresponding material property can be updated.
For stray light analysis within the AR geometry, users can enable the Light Expert option in the simulation, explore the generated LPF results, and navigate the associated ray paths in the Speos 3D environment via the Measures tool.
Note: The radiance simulation might take a couple of hours if you want a final render that includes a stray light image.
Taking the Model Further
Information and tips for users that want to further customize the model
As mentioned before, in this example we used 1-D gratings for the DOEs (Diffractive Optical Elements). To take the model further, users can replace the gratings with their own 1-D or even 2-D gratings to diffract light in one or two directions.
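For a 2-D grating, the in-plane momentum picture generalizes to two grating vectors, and a diffracted order $(m_1, m_2)$ adds an integer combination of both:

$$\mathbf{k}_{\parallel,\mathrm{out}} = \mathbf{k}_{\parallel,\mathrm{in}} + m_1\,\mathbf{G}_1 + m_2\,\mathbf{G}_2, \qquad |\mathbf{G}_i| = \frac{2\pi}{\Lambda_i}$$

which is what allows a single 2-D grating to expand the exit pupil in two directions at once.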
Also, to consider the true perception of the eye according to the set depth of field, the radiance sensor can be replaced with the human eye sensor, and further optimizations can then be done to adjust the performance. Below, an example of the AR system perceived by the eye at several focus distances is shown on the left, while the corresponding eye accommodation is shown on the right.
Additional Resources
Additional documentation, examples and training material
- How to simulate exit pupil expander (EPE) with diffractive optics
- Dynamic workflow between Lumerical RCWA and Zemax OpticStudio
- How the grating projection in FDTD works
- Grating projection script (FDTD)
- Farfield projection of periodic structures
- Speos Lumerical Sub-wavelength Model
- Ansys Lumerical Model Helps Bring Sub-Wavelength Gratings to Visible, Human-Scale Applications in Ansys Speos
- Ansys optics solutions at the heart of optics innovation in AR/VR systems!
- Optics & VR | Advanced Optical Simulation - Ansys Speos Light Path Finder (sapjam.com)
- Optics & VR | Visualization - Ansys Speos Human Vision (sapjam.com)