This example is an extension of the demo system introduced in Augmented Reality Optical System. The model is expanded to a 3-channel waveguide subsystem, with an optimized achromatic projection lens and optimized in-coupler waveguide grating.
This article explores optical design and analysis techniques within the Ansys optics solutions that are essential for designing and validating RGB AR systems. Key features covered include Optical Design Exchange (.odx) for data handling between Zemax OpticStudio and Speos, Live Preview, the Virtual Lighting Controller, Virtual Lighting Animation, and GPU acceleration.
Overview
Understand the simulation workflow and key results
Modern wearable Augmented Reality (AR) technology is typically designed as a glasses- or goggles-type display device that does not disrupt the wearer’s nominal view of the environment, allowing simultaneous viewing of the real world and projected ‘hologram’ (virtual) images. Although several different display optics architectures are currently in use, this article focuses on diffraction-grating-based waveguide technology. The demonstrations in this article use multiple waveguides, each with diffraction gratings tuned to an individual color channel.
Utilizing the concepts of optical subsystem design and optimization introduced in Augmented Reality Optical System, a model representing an RGB display architecture is used as a vehicle to introduce additional integration and analysis tools available in the Ansys optical software environment. These tools and techniques enhance the design workflow and extend optical analysis capabilities.
This application requires version 2024R1 or higher of Lumerical, Zemax OpticStudio, and Speos, but in this example we focus on the steps in Zemax and Speos, assuming that the waveguide grating data is already available. Note that the attached Speos demo data is compatible with Speos 2024R2 or higher.
Run and Results
Instructions for running the model and discussion of key results
Section 1: Integrated RGB Display System Overview
The integrated optomechanical AR display system consists of a color microdisplay positioned at the image plane of a projection lens (designed in Zemax OpticStudio) whose pupil is co-located with the RGB waveguide in-coupler grating surface. The waveguide assembly comprises three high-index waveguides, each with diffraction gratings whose efficiencies are tuned to one RGB color channel (although there is some leakage/crosstalk in the coupling system). Each waveguide employs three linear gratings: an in-coupler near the projection lens pupil, a pupil expander near the brow, and an out-coupler in front of the eye.
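To see why each waveguide’s gratings are tuned to a single color channel, consider the linear grating equation at the in-coupler: for a fixed pitch, the diffraction angle inside the waveguide varies strongly with wavelength, so a single grating cannot place all three channels comfortably within the TIR-to-grazing angular range. The sketch below is a minimal illustration of this, assuming a hypothetical grating pitch and waveguide index rather than the values used in the attached model.

```python
import math

def in_coupled_angle(wavelength_um, pitch_um, n_waveguide, aoi_deg=0.0, order=1):
    """Linear grating equation for transmission into the waveguide:
    n_wg * sin(theta_d) = sin(theta_i) + m * lambda / pitch."""
    s = (math.sin(math.radians(aoi_deg)) + order * wavelength_um / pitch_um) / n_waveguide
    if abs(s) > 1.0:
        return None  # this order is evanescent and does not propagate
    return math.degrees(math.asin(s))

n_wg = 1.9                                      # assumed waveguide refractive index
pitch_um = 0.38                                 # assumed in-coupler pitch (micrometers)
tir_deg = math.degrees(math.asin(1.0 / n_wg))   # TIR limit against air, ~31.8 deg

for name, wl_um in [("blue", 0.450), ("green", 0.532), ("red", 0.635)]:
    theta = in_coupled_angle(wl_um, pitch_um, n_wg)
    if theta is None:
        print(f"{name}: first order is evanescent for this pitch")
    else:
        print(f"{name}: in-coupled angle {theta:.1f} deg (TIR limit {tir_deg:.1f} deg), "
              f"guided: {theta > tir_deg}")
```

For this illustrative pitch, the on-axis field couples at roughly 39, 47, and 62 degrees for blue, green, and red respectively; extending the field of view quickly pushes one channel outside the usable angular range, which is why each waveguide carries gratings optimized for its own color.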
Here, Interactive Simulation allows the user to view ray paths traced through the optical system. The three-channel waveguide design is visible from the traced ray pattern.
Optimization of the pupil expander and out-coupler grating primarily impacts eye box size and color uniformity, while the in-coupler design mainly drives overall efficiency. Please refer to the linked resources [1, 2, 3] for more information about grating/waveguide optimization methods using Ansys Zemax OpticStudio, Ansys Lumerical, and Ansys OptiSLang. For demonstration purposes, in place of a fully optimized set of gratings, this system uses an optimized in-coupler grating and idealized masks to define the relative diffraction efficiency across spatial coordinates of the expander and out-coupler gratings.
The microdisplay is modeled using the Speos Display Source model, enabling simple definition of an input image to be projected through the system for analysis. Results displayed in this article are primarily produced via a Radiance Sensor co-positioned at the eye location to represent human vision, although external sensors or other sensor types are also used for system validation.
Finally, note that as of version 2024R1, both the ODX feature and the Lumerical Sub-Wavelength Model plugin fully support GPU compute within Speos. This advancement is of critical importance for efficient analysis of AR systems, which tend to have low ray-tracing efficiency due to the multitude of diffraction interfaces between the source and viewer. For this system, running the ray trace on the GPU shortens simulation durations by factors of tens to hundreds, depending on the hardware, optomechanical system, and ray trace settings.
|  | Intel i7-8850H (CPU) | NVIDIA RTX A6000 (GPU) |
|---|---|---|
| Rays/Hour | 4.64e8 | 3.02e10 |
| Ratio | - | 65x |
Shown here: Ray trace speed test comparison between CPU and GPU. The result shown was generated using the AR display model as a test system, simulating radiance measured in inverse simulation from the eye position through the display.
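To put the throughput numbers in perspective, the short calculation below converts the table values into wall-clock simulation time for a fixed ray budget; the 1e10-ray budget is an arbitrary illustrative number, not a recommended setting.

```python
cpu_rays_per_hour = 4.64e8    # Intel i7-8850H, from the table above
gpu_rays_per_hour = 3.02e10   # NVIDIA RTX A6000, from the table above
ray_budget = 1e10             # illustrative ray count for a converged radiance result

speedup = gpu_rays_per_hour / cpu_rays_per_hour
cpu_hours = ray_budget / cpu_rays_per_hour
gpu_minutes = ray_budget / gpu_rays_per_hour * 60.0

print(f"GPU speedup: ~{speedup:.0f}x")                                      # ~65x
print(f"CPU: ~{cpu_hours:.0f} h, GPU: ~{gpu_minutes:.0f} min for {ray_budget:.0e} rays")
```

Under this assumed budget, a trace that would occupy the CPU for roughly a day completes on the GPU in about 20 minutes.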
Section 2: Lens Data Transfer via Optical Design Exchange (ODX)
Supporting multiple color channels in the display system requires achromatic projection optics. The lens presented here was designed and optimized in Ansys Zemax OpticStudio, balancing FOV, f/#, and size requirements, along with image quality including color correction, relative illumination, and distortion. For more information regarding lens design techniques and capabilities in Zemax OpticStudio, please visit the Zemax Knowledgebase or Ansys Learning Hub.
Because our system integration and analysis will be performed in Ansys Speos, the lens design needs to be imported from Zemax OpticStudio. This could be done via CAD transfer, but ensuring an accurate representation of the optical properties is cumbersome and time consuming. The Optical Design Exchange (ODX) file transfer tools solve this problem. An .odx file can be generated from within Zemax OpticStudio with a single click, exporting the lens geometries as well as a full description of their optical properties. When this file is imported into Speos, the lens, stop aperture, and image plane geometry, as well as the lens materials and coatings, are automatically defined and applied to the appropriate elements and surfaces.
Shown here: the ODX tools in Zemax OpticStudio (left) and Speos (right) which streamline the lens design transfer process.
Steps to transfer lens data from Zemax to Speos:
- In Zemax OpticStudio version 2024R1 or higher, open the lens file (<downloaded data>\ZemaxModel\projector_waveguide_v2_5E5P-2asph_v2.zar).
- In the File tab, click the “Export Optical Design to Speos” button. Choose a save location for the resulting .odx file.
- In Speos v2024R2 or higher, open the waveguide model (<downloaded data>\SpeosModel_Start\AR Stereo system 2024R2 RGB_start.scdocx).
- In the Light Simulation tab > Components group, select the “Optical Design Exchange” button.
- Select the coordinates and axes of the ODX object by clicking the coordinate axes icon (top-left of CAD window), and then selecting the coordinate origin “ODX_Origin_R” from the structure tree.
- In the ODX object settings, under “Optical Component”, choose the .odx file to import by opening the dropdown list and browsing for the .odx file generated in step 2.
- Click the “run” button in the top-left corner of the CAD window to complete the ODX import.
- To run the simulation "Collimated Right", the newly generated ODX object must be added to the simulation geometries list.
- Double-click the simulation “Collimated Right”, and select the geometries icon in the upper-left corner of the CAD window.
- Click on the newly created ODX object (by default, “Optical Design Exchange.1”) from the Simulation tree.
- Holding the CTRL key, press the green checkmark in the upper-left corner of the CAD window to add this selection to the simulation geometries.
- Now, the simulation can be run. Other simulations can be added and run as well; please see the "..._end" demo file which contains some additional simulations already set up.
Section 3: Advanced Analysis Techniques
Examples of System Level Design Analysis Tools and Techniques
Display Contrast Across Ambient Conditions
In AR devices intended for outdoor use, validation of display brightness is critical. In sunny conditions, solar illuminance can reach 100klux or more; if the display brightness and/or system efficiency is insufficient, the displayed image will become washed out by the ambient light. The Virtual Lighting Controller and Virtual Lighting Animation tools in Speos can be utilized to efficiently analyze display visibility across different ambient conditions. Furthermore, the Human Vision Lab can be used to automatically calibrate the displayed image luminance to match human eye adaptation based on the environmental brightness.
To perform this analysis, the optical system is first embedded within an ambient environment. A simple way to generate an environment in Speos is to use an Environmental Source: a hemispherical image (.exr/.hdr) can be provided and used as a background scene whose brightness is adjustable as a scalar value. Alternatively, as demonstrated here, the optomechanical system can be imported into a 3D geometric scene; in this case, the Natural Light ambient source is used, an atmospheric source model whose brightness is controlled by several parameters including date/time and geographic location.
In this example, three Natural Light source model configurations are used in a single simulation, representing dawn, morning, and afternoon conditions for a given date and geographic location; the corresponding atmospheric illuminance on the ground is 1klux, 10klux, and 100klux respectively. In the analysis sensor’s Layer setting, the “Source” option is selected, which enables the contribution from each atmospheric source model to be viewed independently after the simulation is completed. With this configuration, all three ambient conditions can be modeled simultaneously with a single inverse ray trace.
After completion, the results can be viewed in radiance units in the Virtual Photometric Lab; by evaluating the maximum display brightness this way, we can infer performance under different ambient conditions. Alternatively, viewing the results in the Human Vision Lab adapts the radiance response based on the non-linearity of the human eye, and also provides an option to automatically adapt the eye to the ambient brightness. In either case, the embedded Virtual Lighting Controller tools allow the user to toggle contributions from individual source models on/off, or even scale their contributions in the displayed result. In this way, we can view the independent results for the three ambient conditions. As can be seen below, for this model the fixed-brightness display contrast is significantly degraded in bright daylight conditions.
Shown here: fixed display brightness against varying environmental ambient brightness. From left to right, a 6:30 am, 7:00 am, and 3:00 pm ambient radiance model illuminates the scene. The result is shown in the Human Vision viewer, with automatic eye adaptation.
Finally, the Virtual Lighting Animation allows the user to export a video by controlling the relative contribution of each light source in the scene frame-by-frame, with an easy-to-use spreadsheet-based definition. For example, here the display brightness is maintained while cycling through the three environmental sources. Human Vision functionality is also supported within Virtual Lighting Animation.
Shown here: fixed display brightness against a varying environmental ambient brightness, as a video generated using the Speos Virtual Lighting Animation lab.
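Conceptually, the animation is driven by a per-frame table of relative source contributions. As a minimal sketch of that idea, the snippet below writes such a table to a CSV file, holding the display at full brightness while stepping through the three ambient sources; the column names, weights, and file format are illustrative only and do not reproduce the exact spreadsheet layout that Speos expects.

```python
import csv

sources = ["Dawn (1 klux)", "Morning (10 klux)", "Afternoon (100 klux)"]
frames_per_condition = 30  # assumed number of frames spent in each ambient condition

with open("ambient_sweep.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Frame", "Display"] + sources)
    frame = 0
    for active in range(len(sources)):
        for _ in range(frames_per_condition):
            # Display stays at 100%; only one ambient source contributes per frame.
            weights = [1.0 if i == active else 0.0 for i in range(len(sources))]
            writer.writerow([frame, 1.0] + weights)
            frame += 1
```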
As the animation shows, at 100klux the display contrast becomes very low due to the bright ambient conditions, which may be a critical finding for the AR system design.
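This washout can also be quantified with a simple photometric estimate: approximating the background scene as a diffuse (Lambertian) surface, its luminance is roughly E·ρ/π, and the see-through contrast is (L_display + L_background)/L_background. The display luminance and scene reflectance below are illustrative assumptions, not values taken from the model.

```python
import math

display_luminance = 500.0   # cd/m^2, assumed virtual image luminance at the eye
scene_reflectance = 0.3     # assumed diffuse reflectance of the background scene

for name, ambient_lux in [("dawn", 1e3), ("morning", 1e4), ("afternoon", 1e5)]:
    background = ambient_lux * scene_reflectance / math.pi    # cd/m^2, Lambertian approximation
    contrast = (display_luminance + background) / background  # see-through contrast ratio
    print(f"{name}: background ~{background:.0f} cd/m^2, contrast ~{contrast:.2f}:1")
```

Under these assumptions, the contrast drops from roughly 6:1 at 1klux to barely above 1:1 at 100klux, consistent with the simulated result.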
Interact with the simulation during GPU ray tracing
For a further level of dynamic interaction with analysis results, Speos’s Live Preview feature allows us to view results and interact with the simulation during GPU ray tracing. This functionality enables an entirely new workflow, in which the user maintains an active view of the system results while making adjustments to optical parameters of sources, materials, etc. This improves efficiency by reducing wasted time and compute resources, particularly for meshing, pre-simulation, and trial-and-error.
Shown here: functionality of Live Preview, using the RGB augmented reality device model with an indoors background image scene.
The video above demonstrates opening Live Preview and the early results of the live ray trace. Once the mesh/properties initialization is complete, the window showing the live GPU ray trace result opens. Live Preview allows dynamic adjustment of view rendering settings, as well as switching between radiance and Human Vision rendering. The user can also switch between detectors, and even move the active detector viewing position in 3D space while the Live Preview tool remains active. Optical properties of materials can also be changed and the Live Preview updated, without the need to re-mesh the system.
In addition, functionality similar to that previously shown in the Virtual Lighting Controller is also available from within Live Preview. The panel on the left side shows the available sensor and source objects. As demonstrated, the Ansys logo source image is first disabled and an airplane image enabled, and the live ray trace result responds in real time. The background sources can similarly be changed. As in the Virtual Lighting Controller, the brightness of individual source models can be arbitrarily scaled in the result. And once again, the Live Preview result can be rendered in radiance units or using the Human Vision radiance response.
Shown here: virtual lighting control functionality of Live Preview; sources can be toggled on/off, and their relative power can be scaled dynamically from within the Live Preview window.
Human Eye Model
The Radiance Sensor co-positioned with the human eye, combined with the Human Vision Viewer, allows us to represent the simulated system model as if observed by a human observer. However, because this approach relies on post-processing, some physiological effects are not preserved, particularly depth of field and distortion. The Human Eye Sensor addresses these issues by including an approximation of the human eye optics in the sensor model itself, rather than via post-processing.
Shown here: Human Eye Sensor results with accommodation distance swept from 10cm to 15m. Since the projected hologram's image is collimated, as the eye’s focus moves outwards the hologram becomes sharper, whereas the near-field objects (glasses frames, etc.) move out of focus.
By replacing the Radiance Sensor with a Human Eye Sensor as shown above, the distortion of the human eye is inherently included in the result. Additionally, because pupil diameter and accommodation distance are defined as parameters of the sensor, the depth of field is modeled via the optical ray trace rather than post-processing. This enables more precise results based on eye physiology, and also allows accurate depth-of-field modeling through transparent surfaces such as AR glasses or windshields.
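The accommodation sweep can be sanity-checked with a first-order estimate: for a small pupil, the angular blur of a point is approximately the pupil diameter multiplied by the defocus expressed in diopters. The pupil diameter and object distances in the sketch below are illustrative assumptions, not sensor settings from the model.

```python
import math

pupil_mm = 4.0  # assumed pupil diameter (mm)

def blur_arcmin(object_distance_m, accommodation_m):
    """First-order angular blur (arcmin) of a point at object_distance_m when the eye
    accommodates to accommodation_m: blur (rad) ~= pupil (m) * defocus (diopters)."""
    defocus_diopters = abs(1.0 / object_distance_m - 1.0 / accommodation_m)
    return (pupil_mm / 1000.0) * defocus_diopters * (180.0 / math.pi) * 60.0

hologram_m = float("inf")  # collimated virtual image, effectively at infinity
frame_m = 0.03             # assumed distance to the glasses frame

for accom_m in [0.1, 1.0, 15.0]:  # accommodation distances from the sweep (m)
    print(f"accommodate to {accom_m} m: "
          f"hologram blur ~{blur_arcmin(hologram_m, accom_m):.1f} arcmin, "
          f"frame blur ~{blur_arcmin(frame_m, accom_m):.0f} arcmin")
```

The trend matches the rendered sweep: as the accommodation distance increases, the collimated hologram blur shrinks from over two degrees to about an arcminute, while the near-field frame becomes progressively more defocused.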
Taking the Model Further
The model presented here represents a virtual prototype of the AR device optical subsystem. Armed with such a virtual prototype, system validation beyond the component level becomes possible, including system-level tolerance or thermal analysis, optical crosstalk with other cameras or components, software development from eye tracking to image-processing pipelines, and so on.
One critical issue for current AR device technology is user comfort, especially regarding stereo vision. When the triangulated image points projected into the left and right eyes do not perfectly verge, the user perceives the error as either a depth misalignment (horizontal disparity) or a dipvergence (vertical disparity). The human vision system is particularly sensitive to the latter, since human eyes cannot move vertically independently of one another, making it a significant contributor to eye strain, headaches, and nausea for AR device users. These types of effects, too, can be assessed using a virtual prototype through a combination of stereo image simulation in Speos and post-processing to perform target recognition and triangulation.
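As a minimal sketch of such a post-processing step, the snippet below computes the vertical disparity (dipvergence) between the left- and right-eye lines of sight to a displayed target; the interpupillary distance, target position, and injected vertical error are illustrative assumptions, not results from the attached model.

```python
import numpy as np

ipd_m = 0.064  # assumed interpupillary distance
left_eye  = np.array([-ipd_m / 2, 0.0, 0.0])
right_eye = np.array([+ipd_m / 2, 0.0, 0.0])

# Nominal virtual image point 2 m straight ahead; the right-eye channel is assumed
# to carry a small vertical misalignment introduced by the display subsystem.
target_left  = np.array([0.0, 0.000, 2.0])
target_right = np.array([0.0, 0.002, 2.0])  # 2 mm vertical offset at 2 m (assumed)

def elevation_arcmin(eye, point):
    """Vertical (elevation) angle of the line of sight from eye to point, in arcminutes."""
    d = point - eye
    return np.degrees(np.arctan2(d[1], np.hypot(d[0], d[2]))) * 60.0

dipvergence = elevation_arcmin(right_eye, target_right) - elevation_arcmin(left_eye, target_left)
print(f"vertical disparity (dipvergence): {dipvergence:.1f} arcmin")  # ~3.4 arcmin
```

In a full workflow, the target positions would come from recognizing a common feature in the simulated left- and right-eye images rather than being specified by hand.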
Conclusion
This article demonstrates the capabilities of the Ansys optical software suite, particularly Ansys Zemax OpticStudio, Lumerical, and Speos, for designing and evaluating a grating-based RGB AR waveguide optomechanical system. An AR display system virtual prototype produced in this way can serve as the basis for system-level validation and assessment, reducing the need for physical prototypes.