Opening remarks from Symposium Chair Bernard Kress
Access to the requested content is limited to institutions that have purchased or subscribe to SPIE eBooks. You are receiving this notice because your organization may not have SPIE eBooks access.*
*Shibboleth/Open Athens users: please sign in to access your institution's subscriptions.
To obtain this item, you may purchase the complete book in print or electronic format on SPIE.org.
One of the key challenges for augmented reality is the development of ultra-compact, lightweight, low-power near-to-eye display solutions with good image quality. This talk will review Laser Beam Scanning (LBS) technologies that can meet these key requirements and deliver form factors enabling lightweight, fashionable, all-day-wearable AR smart glasses or HMDs, with the ability to scale resolution and field of view (FoV).
Augmented Reality (AR) smart glasses might be the next big thing in consumer electronics. These devices can be used as an add-on to the smartphone or smartwatch, bringing visual content into the line of sight. This offers comfort, provides safety and enables new kinds of use cases and applications not possible with direct-view mobile displays. Such glasses should obviously be small, lightweight and smart-looking while at the same time projecting a large, bright, colorful and high-resolution image. Optimizing all of these characteristics at once would be like squaring the circle, so tradeoffs have to be made. Depending on the use case, different technologies for the light source, light modulator and combiner optics may be chosen. In this talk we present both LED and laser devices to fuel the future of AR smart glasses. Besides offering RGB LEDs, which are well known from LCoS and DLP pico projectors, we are also working on devices optimized for use in AR glasses. In the field of lasers, we have developed a compact RGB module enabling Laser Beam Scanning (LBS) light engines with a volume of 1 cc or less.
Dispelix is a global leader in augmented reality waveguide displays. Dispelix is a fabless company specializing in the design, development and mass manufacturing of diffractive waveguides for use in augmented and mixed reality glasses and headsets. Our solutions revolutionize industrial and consumer AR wearables today, and our vision answers the future demands of our customers and partners. With our customized waveguide displays and licensing model, we turn your AR wearable visions into reality.
This talk will look at applications of AR and VR, covering two main areas of research. The first is Computer-Generated Hologram (CGH) synthesis, with the goal of achieving the most realistic possible 3D rendering in real time. The second is the compression and transmission of holographic data. These developments are currently built into a prototype holographic augmented reality headset.
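To make the first research area concrete, here is a minimal, hedged sketch of one classical CGH approach: a naive point-cloud hologram, where each scene point contributes a spherical wave to the complex field sampled on the hologram plane. This is an illustrative textbook method, not the speaker's actual algorithm, and every parameter below (wavelength, pitch, scene points) is an arbitrary example value.

```python
import numpy as np

# Naive point-cloud computer-generated hologram (illustrative sketch).
# Each scene point emits a spherical wave; the hologram records the
# superposition of these waves on a sampled plane.

wavelength = 520e-9            # green laser, metres (example value)
k = 2 * np.pi / wavelength     # wavenumber
pitch = 8e-6                   # hologram pixel pitch, metres (example)
N = 256                        # hologram is N x N pixels

# Hologram-plane sample coordinates, centred on the optical axis
coords = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(coords, coords)

# A tiny example "scene": three points (x, y, z) in metres
points = [(0.0, 0.0, 0.1), (1e-4, 0.0, 0.12), (0.0, -1e-4, 0.11)]

field = np.zeros((N, N), dtype=complex)
for (px, py, pz) in points:
    r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
    field += np.exp(1j * k * r) / r   # spherical wave from the point

# A phase-only spatial light modulator would display the field's argument
phase = np.angle(field)
```

Real-time systems replace this brute-force summation with far faster methods (e.g. layer-based or learned approaches), which is precisely where the research effort described above lies.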
Fashionable augmented reality smartglasses require ultra-compact, high-performance display solutions with high brightness and extremely low power consumption. Laser Beam Scanning (LBS) offers advantages over other microdisplay technologies, particularly with respect to brightness, contrast, size, weight and power consumption. A biaxial piezoelectric MEMS scanning mirror operated at resonance in a miniature vacuum environment minimizes power consumption and allows projector size to be reduced to a minimum. MEMS mirror based laser scanning is not only the key to stylish AR smartglasses; it also enables extremely compact depth-sensing cameras that are the basis for interactivity, and not only in the Metaverse.
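A biaxial mirror driven at resonance on both axes traces a Lissajous pattern rather than a raster. The sketch below illustrates that trajectory and the fact that the pattern repeats at the greatest common divisor of the two drive frequencies; the frequencies and scan amplitudes are made-up example values, not any vendor's actual design parameters.

```python
import numpy as np

# Lissajous trajectory of a biaxial resonant MEMS scanning mirror.
# Both axes run at fixed (example) resonance frequencies; amplitudes
# are example optical half-angles in degrees.

f_x, f_y = 20_000.0, 1_450.0       # drive frequencies, Hz (assumed)
amp_x, amp_y = 20.0, 12.0          # scan half-angles, degrees (assumed)

t = np.linspace(0.0, 0.01, 10_000)  # sample 10 ms of the trajectory
theta_x = amp_x * np.sin(2 * np.pi * f_x * t)
theta_y = amp_y * np.sin(2 * np.pi * f_y * t)

# The full pattern repeats when both axes complete an integer number
# of cycles, i.e. at the greatest common divisor of the frequencies.
repeat_hz = int(np.gcd(int(f_x), int(f_y)))
```

Because both axes ring at resonance, only small drive energy is needed to sustain large scan angles, which is why a high-Q vacuum-packaged mirror keeps power consumption low, as the abstract argues.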
TriLite has developed the world's smallest, lightest, and brightest laser beam scanner (LBS) for high-volume consumer AR applications, enabling everyone to enjoy augmented vision as lightweight as the eyewear of today. Based on our ultra-compact RGB laser modules, class-leading 2D MEMS mirrors, and proprietary multi-parameter algorithms, our LBS is fully compatible with state-of-the-art waveguides without requiring any relay optics. Trixel® 3 has been designed from the ground up for mass manufacturing and is set to be the first 2D LBS for AR in production.
Glass diffractive waveguides have gained significant importance in Augmented Reality (AR) near-eye display designs, as they currently offer the best tradeoff between form factor and field of view. In this talk, we will discuss our continuous efforts to provide the best glass substrates, enabling both wider field of view and thinner, lighter, and brighter devices. The impact of smaller image-source exit pupils on glass waveguide designs will be discussed as the industry drives toward further shrinkage of the optical engine. Both optical glass materials and substrate geometry will be discussed in relation to the expected display performance. We will introduce the latest generation of Corning glass substrates for AR and our current product roadmap.
Wafer-level nanoimprint lithography (NIL) is an efficient, high-precision nonconventional lithography method capable of replicating complex micro- and nano-scale structures. The flexibility of this method in imprinting patterns of different shapes on various substrate materials, as well as its reliable fabrication of high-quality surfaces, has made NIL a key enabling technology for next-generation devices and applications across the photonics industry. The results presented in this work give an overview of the flexibility in imprint resins, substrate materials and sizes, and the integration options made possible by UV-NIL.
In optics, the surface determines the function. Classic optics has material parameters, like refractive index, but it is the material's shape/curvature (and roughness) that determines the function. The advancement of nanotechnology has led to new methods that drastically lower fabrication costs per cm² (compared to semiconductor approaches) and to new insights into how electromagnetic waves can be influenced at the fundamental level. That is how nano-photonics was born. Nano-photonics allows us to control light with much more precision and with more functions than is possible using conventional optics, and it enables new technologies such as diffractive waveguides and meta-lenses. The demands that these applications place on the patterns are not easily met. The size and shape need to be reproducible to an absolute size with variations of less than 1–2 nm. Furthermore, the materials used need a high refractive index, preferably above n = 2.0. Large-area devices drive toward 300 mm wafer processing to increase output and lower costs. To decrease costs further, directly patterning functional optical materials is needed, saving two vacuum deposition steps and three vacuum dry-etch steps, both of which use expensive equipment. The industry is converging on nanoimprint lithography as a production technology that can address all these challenges. SCIL Nanoimprint Solutions builds on more than 20 years of material, process and tool-building experience to provide customer-specific high-volume production solutions. Our approach has always been to start from the process and functional materials, which allows us to optimize for stamp lifetime (500+), directly patterning fully inorganic functional materials (refractive index up to n = 2.1), and binary, blazed or slanted patterns with high accuracy (less than 1 nm absolute size variation).
Our latest FabSCIL cluster tool offers processing of 200 and 300 mm wafers, from 300 µm up to 2.5 mm thickness, with overlay below 1 µm even when aligning patterns from the top to the bottom of the wafer. In this contribution we will elaborate on the material systems, reproducibility and production solutions.
If AR smart glasses are to become the ‘next big thing’ and commonplace, their optics must be easier and more affordable to manufacture. Though optics may be only part of a total AR system, it plays a vital role as the window to the world for its wearer. The industry has already set the technical criteria for an acceptable AR optics solution in terms of efficiency and form factor, but the commercial viability of this complex optical product remains challenging. The solution to cost-effectively producing waveguide combiners is to switch the manufacturing mindset from a purely semiconductor-based one to one rooted in the realities and learnings of the display industry. We at Morphotonics strive to build the foundation for a future in which large-area nanoimprinting can produce AR optical waveguide combiners in display volumes. In this talk, we will address the upscaling of AR optics manufacturing via large-area nanoimprinting as well as the remaining challenges in terms of performance, materials, and manufacturability. Additionally, we will highlight the exemplary work recently undertaken with our industry partners to validate the entire value chain, from design to mastering to replication, on panel-level nanoimprint equipment using rectangular high-refractive-index glass substrates and high-refractive-index resins.
To fabricate lightweight near-eye displays for AR applications, surface relief gratings are used to couple light from the source into the light guide and out of the light guide toward the eye. To suppress higher diffraction orders and thus maximize the light yield, these gratings are slanted. Reactive Ion Beam Trimming (RIBT) was used to fabricate surface relief gratings with both varying slant angle and varying trench depth on a master stamp. The substrate was covered with a hard mask providing the basic grating geometry, such as period and duty cycle. Localized ion beam etching was applied to etch the grating trenches. Varying the ion irradiation dose (by means of dwell-time variation) defines the trench depth on a local level. The angle of incidence of the ion beam transfers directly into the substrate as the slant angle and can be varied independently of the dwell time, providing the unique possibility of achieving a continuous variation of the slant angle across one die. In a later step, nanoimprint lithography can be used for replication.
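The dose-to-depth relation described above can be sketched numerically: to first order, with a constant local etch rate, trench depth scales with dwell time, while the beam's angle of incidence sets the slant angle independently. The etch rate and target profiles below are made-up illustrative numbers, not values from the paper.

```python
import numpy as np

# First-order model of RIBT patterning: depth ∝ dwell time at a
# (assumed) constant etch rate; slant angle is set separately by the
# ion beam's angle of incidence.

etch_rate_nm_per_s = 2.0                    # assumed constant etch rate

# Example target: trench depth ramping linearly from 30 nm to 80 nm
# across 101 positions on a die
target_depth_nm = np.linspace(30.0, 80.0, 101)

# Dwell-time map that (to first order) yields that depth profile
dwell_time_s = target_depth_nm / etch_rate_nm_per_s

# Slant angle varies independently of dwell time, e.g. a continuous
# 25° -> 40° ramp across the same die
slant_angle_deg = np.linspace(25.0, 40.0, 101)
```

The key point the abstract makes is visible in the model: the two control knobs (dwell time and beam incidence angle) are orthogonal, so depth and slant can be varied independently across a single die.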
We present our latest developments and results in making masters for the replication of surface relief waveguides. Accurate control of the diffractive gratings in the input, expander and outcoupling (eyebox) areas is essential for well-performing AR glasses. Advanced grating shapes are used in many designs, and we present results from masters and sub-masters with large, high-precision areas containing binary structures as well as both blazed and slanted gratings. Besides the waveguides used in the displays, registration of the actual surroundings using 3D sensing is a very important feature for getting the most out of the AR experience. We present the latest developments of our compact optical solutions for 3D sensing.
This talk will discuss a number of Goertek's highly integrated, high-performance system-in-package (SiP) solutions that meet the requirements for small size and functional integration in intelligent hardware systems, as well as the latest lineup in its high-performance, high-precision MEMS sensor series, including blood pressure, differential pressure, pneumatic and vibration sensors.
We believe we are in the early stages of AR, and that the solutions needed to enable the user experience that will drive mass adoption will be more sophisticated than those currently available. This talk shares some of the technologies we are developing to enable the all-day augmented reality glasses we want to wear.
HOYA is one of the very few elite-level wafer suppliers for AR applications. Its unparalleled expertise in optical glass and lens technologies, its ability to supply high-refractive-index glass wafers, and decades of experience in developing and introducing advanced optical glass and lenses for global markets in a variety of fields bring a completely new dimension to the AR waveguide display market. In this talk, you will hear about the HOYA group's capabilities for serving the escalating AR and MR waveguide industry from multiple depths and angles.
Over the past three years the XR market has become awash with devices aiming to deliver immersive user experiences. However, no two XR devices are the same, and each will require the ability to tune optical and mechanical properties to optimize device performance in the Metaverse. Pixelligent's Designer Compounds™, built on our proprietary PixClearProcess®, provide the broadest design capability in terms of feature sizes, RI matching, mechanical properties and material platforms, and are a critical enabling material in waveguides, sensors, and metalenses.
End users’ expectations of smartglasses are benchmarked by today’s close-to-perfect smartphone displays and the fashionable wearability of smartwatches. Users expect a deceptively genuine mixed reality experience and uncompromised all-day wearability. Waveguide-based combiner optics are the technology of choice to get there. Different technologies, based on diffractive, reflective and holographic structures, are competing to most efficiently inject, distribute and extract the waveguided information. Driven by the commitment to define the industry’s reference for waveguide materials and components, SCHOTT develops and commercializes dedicated RealView® products to support developers and manufacturers of AR optics. Progress on benchmark performance in ever-higher refractive index and lightweight materials is discussed in the context of the industry’s needs for wider field of view, lower weight and high-volume manufacturability. SCHOTT’s contribution to AR does not end here: you will also learn about optical wafers for sensors and meta optics, as well as hermetic packages for semiconductor light sources.
Commercialization of NanoLED electroluminescent Quantum Dot displays may happen sooner than you think. Device efficiencies already approach or meet commercial requirements and lifetimes are improving rapidly. In this presentation, Jason Hartlove, Nanosys CEO, reports on recent breakthroughs in the development of NanoLED displays including Heavy Metal Free Quantum Dots, microLED and hybrid QD-OLED approaches.
AGC Group, a leading global material and solution provider, offers several optical components for the AR/MR market, such as glass wafers for waveguides and diffractive optical elements, or glass diffusers for sensor applications. For waveguide applications, glass wafers with high refractive index and high transmittance are required for a wide field of view and a bright image. New high-refractive-index glass wafers in the M100 series have been developed and are now being offered to the market. In addition, our capabilities in optical simulation and waveguide prototyping enable feedback into our materials development from an AR/MR device point of view. In the presentation, we introduce our product lineup.
We will talk about the advances in optical quality plastic carrier solutions, recording techniques, and optical isolation for integration of lightguide combiners in lenses using Bayfol® HX volume holographic optical elements.
Actuators are intrinsic to many high-performance optical systems. This presentation will discuss how actuators can improve XR camera and display performance, and in some cases even lower overall system power. We will reveal why the unique material properties of Shape Memory Alloy (SMA) make it the ideal actuation technology for XR devices, and how it is already used in tens of millions of consumer devices.
Magic Leap’s decade-long focus on premium AR has resulted in a revolutionary optical system. This presentation will provide a never-before-seen look at the advancements coming to market with the new Magic Leap 2 spatial AR headset, including the novel projector architecture that delivers the largest FOV in a compact package. Precise eye tracking enables the largest safe and comfortable display volume. Coupled with a new high-performance CPU and GPU chipset and an industry-leading eyepiece manufacturing process, Magic Leap 2 delivers a bright, crisp and uniform 3D display. Innovative dimming technology means developers can deliver solid virtual AR content in a wide range of ambient environments.
This is a combined presentation with 11932-71 (speaker: Qiming Li). The same video, containing both presentations, is published under both presentation records.
Consumers are still awaiting an all-day use smart glass solution without compromise. Key elements of the hardware ecosystem must be developed hand-in-hand to ensure that consumer expectations can be met. Loose alliances will not provide the desired results since individual technology roadmaps, interfaces, teams and budgets must have a clear and joint focus point to push the boundaries of AR/smart glasses technologies every day in real partnerships. tooz is forging strategic partnerships beyond its own technology boundaries to provide the best-in-class optical engine. The talk will provide exclusive insights about one of tooz’ latest strategic partnerships around the unique tooz curved waveguide with Rx integration.
This is a combined presentation with 11932-25 (speaker: Frank-Oliver Karutz). The same video, containing both presentations, is published under both presentation records.
It is common for technologists to push a single parameter to its extreme at the cost of creating a truly human-usable product. In this talk, we will discuss ideal AR requirements and how the right tradeoffs are enabled by considering use cases and the individuals who will use the product.
French startup Lynx has had a strong 2021, having redesigned its mixed reality (MR) headset earlier in the year, followed by a successful Kickstarter that raised just over $800,000 USD. Founder Stan Larroque will give an update on the Lynx-R1, a fully all-in-one (AIO) MR headset featuring a 2.1″ octagonal LCD panel, 1600 × 1600 per-eye resolution at 90 Hz with a 90° FOV, all powered by a Qualcomm XR2 chipset, 6 GB RAM and 128 GB internal storage that is expandable via an SD card slot.
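The quoted panel and FOV figures imply an average angular resolution that is easy to check. The back-of-envelope calculation below divides horizontal pixels by horizontal FOV, ignoring lens distortion and any non-uniform pixel mapping, so it is an approximation rather than an official specification.

```python
# Average angular resolution implied by the quoted Lynx-R1 numbers:
# 1600 px per eye across a 90° field of view.
h_pixels = 1600
fov_deg = 90.0
pixels_per_degree = h_pixels / fov_deg
# ≈ 17.8 px/deg, well below the ~60 px/deg often cited for 20/20 acuity
```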
Headset-based AR/VR offers an immersive dive into these new digital worlds, but to many it still feels cumbersome and unfamiliar. As a result, mass adoption is still relatively slow. Lightfield displays offer a naturally immersive, “window-like” 3D look into the Metaverse while leaving users’ faces unencumbered. They can be readily deployed on familiar terminals, from smartphones / tablets to laptops or automotive displays. Better still, this method ensures compatibility with much of the existing digital content ecosystem, hence democratizing access to the Metaverse and potentially accelerating its deployment. In this talk, I will review our efforts at Leia to commercialize Lightfield-based mobile devices and our take on how to steadily ramp consumer adoption of the Metaverse.
It's clear there's no "one size fits all" product for headworn XR. Different use cases have different requirements and tradeoffs that make sense for one yet fail spectacularly for another. Learn how Campfire has addressed challenges of current products for holographic collaboration. Starting with optics, Campfire has taken a very different approach to XR devices and applications to overcome barriers of the past in visual experience, ease-of-use, and workflow integration.
Hear from an XR Pioneer with roots at Apple, Echo Frames, Hololens, Second Life, Keyhole (became Google Earth), Disney and more. Imagine 10–20 years from now, we’ll each have a pair of contact lenses that can create AR and VR as well as we want (except maybe touch, taste, and smell). By then, the words “AR,” “VR,” and “Meta” will likely be relegated to academic writing and old-timey company branding in favor of something hip, now and organic. The future User Experience is a bit easier to project. Open your eyes and you’ll see 3D holograms in the real world perfectly mixed with real objects and people. Close your eyes (or otherwise elide the natural light) and you can be virtually anyplace else. Audio must also mix perfectly. But AR and VR are only two points on a spectrum. If you start with AR and add enough virtual stuff to distract you from reality, you’re effectively in VR. If you add digitized 3D “twins” or otherwise live camera feeds of your real-world environment into VR, you’re essentially back in AR again, or at least a simulation of it.
Metaverse, for better or worse, is the buzzword of the moment. But what will the metaverse actually look like? What is needed to make it a strong and effective ecosystem, besides hardware, content and software, connectivity (5G), security, understanding of cultural and ethical impacts, and applications for enterprise? What is myth about the metaverse, and what will be reality? Join this panel and hear what these leaders have to say.
Invited talks 2b: Use Cases and Perceptual Research
Millions of people have visual impairments that cannot be corrected by conventional eyeglasses. Can emerging augmented reality technology eventually be used to create “smart eyeglasses” that digitally enhance visual information for people with impaired vision? I will present work in which we use existing augmented reality systems to design visual enhancements aimed at assisting with key visual functions, such as recognizing objects and navigating buildings, and assess these enhancements using perceptual and behavioral studies.
NIST PSCR was allocated $300 million to support research for future first responder technologies. The UI/UX team is using Virtual and Augmented Reality (VR/AR) to test usability and allow for rapid prototyping of new user interfaces. They've worked with law enforcement, fire service, and emergency medical services to develop unique virtual environments and scenarios supporting the development of visual aids, haptic feedback devices, and audio-cue-style interfaces. The UI/UX team has also awarded over $15 million in grants and $1.5 million in prizes to external collaborators supporting research and development for public safety use of VR/AR technologies. Scott will review past NIST VR and AR prize challenges and share key developments from the 2018 PSIAP-UI and 2020 PSIAP-AR grant awardees.
Use of mixed reality (MR) displays in education and training has become increasingly commonplace. However, the inherent vergence-accommodation conflict (VAC) raises concern that use of these technologies may affect vision. We investigated dynamic accommodation in young adults before and after performing an executive function task in MR (HoloLens 2). We found large differences in accommodation before and after the task for some individuals. We hypothesize that these changes could be an effect of the VAC, and that using MR over longer time periods could be disadvantageous for some, while for others it may increase flexibility in the vergence-accommodation system. This emphasizes the importance of understanding individual visual characteristics when incorporating virtual displays into education and training.
As business leaders respond to the impact of COVID-19 on their routines, customers, and organizations, smart tech solutions must be integrated with mindfulness of the interconnected nature of the economy, society, and a connected world. As we respond to the threat of disease impacting our daily routines, customers, and businesses, we have much to learn from past challenges. The importance of hygiene within the industry has only recently been recognized, and now we have a valuable solution.
Many augmented reality (AR) displays cannot dynamically map physical environments. This may result in positioning errors when virtual content should be occluded by an intervening physical object but remains visible. We evaluated how such conflicts impact viewers’ ability to localize virtual objects using a distance matching task in AR. We find accurate matches when the virtual target is positioned at or in front of a physical surface. However, when rendered beyond the surface, target position is underestimated, and errors increase systematically with distance. If generalizable, these results will be useful for predicting patterns of spatial errors in AR user interactions.
The study of visual perception has typically relied on psychophysical experiments presenting stimuli on fixed displays and viewed from a restrained head position. While these tightly controlled methods are useful for fundamental perceptual research, the results gained from them may not generalise well to everyday human experience. XR has the potential to be a powerful methodological tool for studying visual perception under more ecologically valid conditions. However, XR does not perfectly recreate natural viewing conditions and sensory cues. The utility of XR for perceptual research relies on researchers understanding its perceptual limitations and using it appropriately. In this talk I will provide an overview of these perceptual limitations as well as discuss the research areas in which XR might be particularly impactful.
Metasurfaces form a powerful approach to realizing compact and lightweight optical elements, which can be integrated into smart glasses and head-mounted displays. In this talk, we explore the opportunities that arise with electronically programmable active metasurfaces, which are simultaneously modulated in both space and time. Using electro-optical effects, our group has previously demonstrated metasurfaces that control the spatial features of light. By doing so, we have been able to realize multifunctional optical elements that can achieve beam focusing and steering with a high signal-to-noise ratio.1,2 However, in this quasi-static operation regime, the applied signal is not varied in time. The introduction of time modulation additionally allows the creation of higher-order frequency harmonics that provide control over the spectral content of the scattered light. We implement time-modulated metasurfaces by integrating an indium tin oxide (ITO) based, electro-optically tunable metasurface operating at 1550 nm into a radiofrequency network. Each metasurface element is modulated at up to 100 MHz to generate frequency harmonics that are well separated from the central frequency. With the use of additional nonresonant phase shifters, we engineer space-time modulated wavefronts that allow us to control a four-dimensional design space. Finally, we demonstrate a metasurface architecture consisting of interdigitated subarrays that are independently controlled using distinct spatiotemporal phase fronts. With this, we are able to demonstrate simultaneous and independent shaping of beams at distinct frequencies using a single chip. We foresee that this technology will have direct implications on the future of multi-channel optical communication networks used in AR/VR systems.
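The central mechanism of the abstract above, time modulation of an optical field generating well-separated frequency harmonics, can be illustrated numerically. The following is a minimal sketch of the principle only, with arbitrary made-up parameters; it is not a model of the authors' ITO metasurface or radiofrequency network.

```python
import numpy as np

# Toy sketch (our illustration, not the authors' device model): sinusoidal
# phase modulation of an optical field at f_mod, as a time-modulated
# metasurface element performs, generates frequency harmonics (sidebands)
# offset from the carrier by multiples of f_mod.
fs = 1000.0                        # sample rate, arbitrary units
f_mod = 50.0                       # modulation frequency
depth = 0.5                        # phase-modulation depth in radians
t = np.arange(0.0, 1.0, 1.0 / fs)

# Baseband complex field of one element under sinusoidal phase modulation.
field = np.exp(1j * depth * np.sin(2 * np.pi * f_mod * t))

spectrum = np.abs(np.fft.fft(field)) / len(t)
freqs = np.fft.fftfreq(len(t), 1.0 / fs)

def power_at(f):
    """Spectral amplitude at the bin nearest frequency f."""
    return spectrum[np.argmin(np.abs(freqs - f))]

# The carrier (baseband 0) dominates, with first-order harmonics at +/- f_mod
# cleanly separated from it; those separated harmonics are the handle for
# addressing distinct frequency channels independently.
print(power_at(0.0), power_at(f_mod), power_at(-f_mod))
```

By the Jacobi-Anger expansion, the harmonic amplitudes follow Bessel functions of the modulation depth, so deepening the modulation transfers more energy from the carrier into the sidebands.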
Wearable AR devices are currently following a well-trodden route to market, starting with industrial and military applications before expanding into enterprise. However, consumer adoption of AR has stumbled upon significant hurdles: inadequate hardware and lack of compelling everyday applications, beyond smartphone-style functionalities. Computer-Generated Holography (CGH) has the potential to address both, enabling 3D display architectures with compact optics, and unlocking gaming - a huge consumer use case for AR. In this talk, we discuss key challenges and opportunities for gaming applications in AR wearables, and VividQ’s recent technological innovations in CGH, including 3D waveguides, that enable them.
Holographic near-eye displays promise to deliver full 3D experiences in a novel and potentially very thin/small optical layout and have made remarkable progress over the last few years. Unlike conventional displays, holographic displays are based on phase-only spatial light modulators (SLMs), which can show 3D images by shaping a wave field such that the target image is created through interference. However, holographic displays have remained confined to the research lab because of computational complexity, low image quality, and bulky optics. In this presentation, I will introduce recent research efforts from Nvidia and Stanford researchers to solve these problems with learning-based approaches: optimizing phase patterns in real time with a learned wave propagation model (Neural Holography), speckle reduction with partially coherent light sources (Speckle-free Holography), higher contrast with an additional SLM and camera-in-the-loop optimization (Michelson Holography), and optimizing high diffraction orders without optical filtering for compact holographic displays (Unfiltered Holography).
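The phase-pattern computation the abstract above refers to can be sketched with the classical Gerchberg-Saxton algorithm, the conventional baseline that learned approaches such as Neural Holography improve upon. This is our own minimal illustration, using an idealized FFT propagation model and a toy target, not the authors' method or propagation model.

```python
import numpy as np

# Classical Gerchberg-Saxton phase retrieval: iterate between the SLM plane
# (phase-only constraint) and the image plane (target amplitude), using an
# FFT as an idealized far-field propagation model.
rng = np.random.default_rng(0)
N = 64
target = np.zeros((N, N))
target[24:40, 24:40] = 1.0                    # bright square as target image
target_amp = target / np.linalg.norm(target)

phase = rng.uniform(0.0, 2 * np.pi, (N, N))   # random initial SLM phase
for _ in range(100):
    field = np.fft.fft2(np.exp(1j * phase))            # SLM -> image plane
    field = target_amp * np.exp(1j * np.angle(field))  # impose target amplitude
    back = np.fft.ifft2(field)                         # image -> SLM plane
    phase = np.angle(back)                             # keep phase only (SLM constraint)

# Overlap between the reconstructed amplitude and the target (1.0 = perfect).
recon = np.abs(np.fft.fft2(np.exp(1j * phase)))
recon /= np.linalg.norm(recon)
corr = float((recon * target_amp).sum())
print(corr)
```

The learned methods above replace the idealized FFT with a wave propagation model fitted to the physical display (camera-in-the-loop), which is what closes the gap between simulated and observed image quality.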
From cameras to displays, visual computing systems are becoming ubiquitous in our daily life. However, their underlying design principles have stagnated after decades of evolution. Existing imaging devices require dedicated hardware that is not only complex and bulky, but also exhibits only suboptimal results in certain visual computing scenarios. This shortcoming is due to a lack of joint design between hardware and software, importantly, impeding the delivery of vivid 3D visual experience of displays. By bridging advances in computer science and optics with extensive machine intelligence strategies, my work engineers physically compact, yet functionally powerful imaging solutions of cameras and displays for applications in photography, wearable computing, IoT products, autonomous driving, medical imaging, and VR/AR/MR. In this talk, I will describe two classes of computational imaging modalities. Firstly, in Deep Optics, we jointly optimize lightweight diffractive optics and differentiable image processing algorithms to enable high-fidelity imaging in domain-specific cameras. Additionally, I will discuss Neural Holography, which also applies the unique combination of machine intelligence and physics to solve long-standing problems of computer-generated holography. Specifically, I will describe several holographic display architectures that leverage the advantages of camera-in-the-loop optimization and neural network model representation to deliver full-color, high-quality holographic images. Driven by trending machine intelligence, these hardware-software jointly optimized imaging solutions can unlock the full potential of traditional cameras and displays and enable next-generation visual computing systems.
The automotive industry recognizes the vibrant development of augmented reality (AR) technology and identifies that it will be a great enhancement for the business from design, manufacturing, safety, and user experience perspectives. Recently, AR head-up displays (HUDs) have been showcased with fascinating and useful applications, and several automotive OEMs advertise their intent to implement AR HUDs in future programs. In this presentation, we will focus on HUDs: we will provide an overview of AR HUD use cases that enhance the driver experience, outline the requirements for creating an AR experience, and review the enabling technology. Specifically, we will describe the key elements of phase holography and the use of this powerful technology to achieve a compact optical system design.
Virtual and augmented reality gadgets need to map their surroundings, locate objects, and track gestures to create surreal and engaging experiences. And to be ubiquitous, they need to do so over large spaces in bright outdoor conditions, using as few tiny, low-cost, low-power sensors as possible. Join this talk to see how Lumotive’s Meta-Lidar technology makes this possible with a single sensor, making AR/VR experiences real.
Holographic optical elements (HOEs) can not only replace traditional lenses and mirrors but can also provide optical functions that are very difficult to achieve with conventional optics. HOEs are smaller, lighter, cheaper, and better tailored to customer needs than conventional optics. All of the optical workings of HOEs are compressed into a thin nanostructured layer of recording material. META’s ARfusion™ platform combines precision cast lens fabrication tools with functional metasurfaces, including HOEs, to provide AR wearable developers with a platform for seamlessly integrating smart technologies into thin, lightweight prescription glasses. In this talk I will describe how these embedded smart technologies can include waveguides, micro-displays, holographic filters, polarizing and liquid crystal/electrochromic foils, antifogging and eye-glow suppression films, antennas, and eye trackers.
Grace Hwang leads a cross-disciplinary team at Microsoft that explores what the future of the 5-year-old HoloLens mixed-reality project looks like—a glowing, secondary world of holographic objects and people that can appear in an ordinary home or office. This year, the company debuted its Mesh platform—a virtual conference room where people can meet not just via HoloLens but also through an Android tablet, iPhone, or Oculus VR headset. Current customers include Accenture and Ray and Mark Dalio’s non-profit, OceanX.
Today it is possible to 3D print lenses in volume, enabling unique benefits for traditional prescription lenses and for smart eyewear. With 3D printing, lenses can be printed with spheres, aspheres, cylinders, prisms, bifocals, trifocals, progressives, and anything in between. It is possible to integrate devices inside these prescription lenses, for example waveguides, films, sensors, LCDs, or anything else you can think of. It is possible to create air gaps or to print directly on the device, and the device can be glass or plastic. The latest innovation is that we can print not only on flat but also on curved surfaces.
Morrow’s Autofocal eyewear tackles the intrinsic trade-off multifocal/progressive eyewear users face between higher reading power and reduced field of view. In the first part of this talk, we’ll introduce the integration challenges that were overcome to make conventional prescription glasses ’smart’, and explain how our product is kept simple yet meticulously geared towards the needs of our customers. In the second part, we’ll reflect on the potential roadmap towards more functionalities, and what a collaborative crossover with AR/VR parties might look like.
The past several years have seen significant progress in the design of robust, scalable mixed reality systems. During that time, considerable investments have been made in the development of diffractive technologies for mixed reality. Nonetheless, catadioptric solutions are still a viable, if not preferred, alternative for many applications. Simulation is critical to determining the best technology for the desired application, and advanced technologies – such as Rigorous Coupled Wave Analysis – are required to simulate both solutions (diffractive vs. catadioptric). In this talk, I will highlight the use of these advanced technologies in the design and development of next-generation mixed reality systems.
The move towards a more digital and contactless world will increase the demand for holographic optics as the core optic in a pair of glasses or a car windshield. How will we do that? Find out in this talk from the CEO of Trulife Optics.
The presentation shows how dynamic aberration correction can increase the apparent resolution, eliminate color fringing and pupil swim effects, and enlarge the eye-box in the highest-end displays, such as Varjo VR-1 and HP Reverb G2. Having to achieve high resolution, wide field of view, and a large eye-box, VR/AR head-mounted display makers face challenges that are impossible to overcome by hardware design alone. Even the latest-and-greatest devices retain common flaws that spoil the user experience: blur and color fringing outside of the small “sweet spot,” picture quality degradation and geometry distortion at wide gaze angles, and a tiny eye-box. In order to achieve realistic picture quality and a natural visual experience, the rendering pipeline has to include advanced image pre-processing beyond the standard geometry warp and channel scaling. Almalence created the Digital Lens, a computational solution utilizing a precise characterization of HMD optical properties along with a dynamic aberration correction technique that adjusts on the fly to eye-tracking data.
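The pre-processing idea described above can be sketched numerically: if the per-channel distortion of the optics is known, rendering each color channel through the inverse mapping makes the optics land every pixel back on target, cancelling color fringing. The model and all coefficients below are our own toy illustration, not Almalence's actual Digital Lens characterization.

```python
import numpy as np

# Toy model: lateral chromatic aberration as a per-channel radial distortion
# r_out = r * (1 + k_c * r^2). Pre-warping the rendered image with the inverse
# mapping cancels the distortion the optics then apply.

def distort(r, k):
    """Forward radial distortion applied by the HMD optics (toy model)."""
    return r * (1.0 + k * r ** 2)

def prewarp(r_target, k, iters=30):
    """Numerically invert the distortion: find r with distort(r, k) = r_target."""
    r = r_target.copy()
    for _ in range(iters):
        r = r_target / (1.0 + k * r ** 2)  # contractive fixed-point iteration
    return r

# Hypothetical per-channel coefficients; a dynamic system would re-derive
# these every frame from eye-tracking data as the pupil moves in the eye-box.
k_rgb = {"red": 0.08, "green": 0.10, "blue": 0.13}
r_target = np.linspace(0.0, 1.0, 256)   # normalized field radii

max_err = 0.0
for k in k_rgb.values():
    r_render = prewarp(r_target, k)     # where the renderer should draw a pixel
    r_seen = distort(r_render, k)       # where the optics actually place it
    max_err = max(max_err, float(np.max(np.abs(r_seen - r_target))))

print(max_err)  # residual radial error after pre-correction
```

A real pipeline operates on 2D images per channel and must re-solve this inverse as the correction changes with gaze, which is why the correction is described as dynamic.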
AR headsets are being used to improve efficiency in staff training, manufacturing, production, design, and more. As it evolves, AR will provide immersive step-by-step instructions for technicians, leading to time savings and cost reduction. AR makes work more accurate and the work environment safer through engaging simulation and training across different scenarios. On the consumer side, it’s mostly mobile AR that prevails in the mass market; the major factor fueling consumer AR market growth is the worldwide high penetration of smartphones and mobile devices. Which is going to win?
Join this panel and hear what these experts think are the next steps for prescription smartglasses. Roughly 126 million Americans wear eyeglasses, and they pay an average of $576 a year for the privilege. Given the trend towards smart devices elsewhere, it should come as no surprise that prescription smart glasses are next. What is it really going to look like? Come find out.
Since 2019, Lightpace has been developing multifocal near-eye displays. The company invented a multilayer liquid crystal optical diffusing switch a few years ago. This talk will present the newest design of the multilayer near-eye flat display module, which enables the development of more miniature AR glasses.
We have successfully developed a waveguide with a 60+ degree FOV and a 1.2 cc micro projector for AR glasses. The larger FOV will deliver more immersive experiences, and the projector is compact enough to be embedded in the frames of the glasses.
Since 2016, Dispelix has been developing diffractive waveguides for near-to-eye displays. The road has seen setbacks, bumps, and evolution along the way. Diffractive waveguides, like competing approaches, suffer from many inherent challenges. Are diffractive waveguides mature enough for consumer adoption? Key benefits and challenges of diffractive waveguides with respect to all-day wearables will be discussed.
Eyes are by far our main input from the real and digital worlds. The future of Augmented Reality therefore vitally depends on a display technology that can merge the two worlds into one. The real world, however, is 3D, while today’s “3D displays” are not: they lack focal depth. True 3D needs a display revolution. Light-field delivers it. Contrary to widespread notion, light-field can provide genuinely 3D imagery while being as efficient as, or more efficient than, a conventional raster display for AR. This talk will introduce why and how.
T-Glasses is LetinAR’s truly eyeglasses-shaped lightweight augmented reality device which uses plastic optical combiners. Thanks to its unique PinTILT (Total Internal reflection minimized Lightguide Technology) structure, it shows moderate optical manufacturing tolerance allowing injection molding suitable for low-cost mass production. In this talk, Dr. Park introduces the latest advancement of PinMR (Pin Mirror) optical structure: PinTILT, and how it can minimize the number of total internal reflections by adopting a modified birdbath structure, which results in the minimal form factor of AR glasses.
We will present reflective waveguide technology and its advantages and fit for near-to-eye displays for consumer AR glasses. 2D pupil expansion reflective waveguide technology provides a small waveguide entrance pupil and allows for a compact projector module that can be easily integrated into the glasses frame. Together, the projector and waveguide assembly provide a high-resolution, crisp image and significant advantages in power efficiency, color uniformity, light leakage, and a glare-free view of the physical world.
Waveguide displays provide a transparent optical solution for delivering a digital layer that augments the physical world. Consideration is given to the artifacts arising within waveguide displays that affect social acceptability, which is critical for the mass adoption of waveguide displays, particularly in the consumer market. This includes consideration of optical see-through transmission, eye glow, and the causes and impacts of other glints and interactions of stray light, both within the waveguide and from external illumination sources. Particular consideration is given to DigiLens’ high-efficiency holographic waveguide displays, with which the author is most familiar, with comparisons drawn against other technologies in the space.
As augmented reality technology develops, what kind of technology will meet users’ performance needs in the future? Consumer AR is coming, and two-dimensional reflective waveguide technology holds promise. Compared with one-dimensional pupil expansion, it offers many advantages, such as a larger field of view, a smaller light engine, and a thinner body. In 2021, Lingxi AR successfully developed a two-dimensional optical module with a 60° field of view and is vigorously promoting mass production. It will promote the rapid development of consumer AR.
To date, AR companies have struggled to create beautiful, stylish products while maintaining performance and manufacturability. The AG-30L and AG-50L address these challenges through Avegant's proprietary optical and illumination technologies, delivering high performance in a manufacturable way. The combination of these features will enable a new generation of AR glasses with an ultra-compact form factor. We've seen considerable excitement in the industry around our light engines because they are significantly lighter and smaller, and they meet our customers' specifications for efficiency, resolution, and contrast.
Photonics is one of Sony's core technologies and the foundation of the core devices that create the values of Reality, Real-time and Remote. This talk introduces Sony's unique photonic and display devices for XR, including micro-displays, visible VCSELs and novel display technologies. The prospects for the evolution of these technologies will also be presented.
Nreal, founded in 2017, is dedicated to delivering consumer-facing Mixed Reality (MR) experiences around the world. Nreal's flagship Nreal Light MR glasses are lightweight, comfortable and sport a wide field of view (FoV). First unveiled at the 2019 Consumer Electronics Show (CES) in Las Vegas, Nreal Light was widely acclaimed for its wearability and digital interactions. Nreal won Engadget's "Best of CES Award", becoming the first Chinese startup to do so since the award's inception.
Micro LEDs offer the high brightness and low power consumption required for AR wearables to gain mass acceptance. However, using uLEDs in AR products poses a variety of challenges (epitaxy, chip size, transfer technology, wafer bonding, color conversion, inspection, etc.).
Hear from leading industry experts about the latest scientific advancements and market trends.
Extended reality (XR) technologies that enhance or replace our view of the world are rapidly moving toward the mainstream as supporting professional and consumer applications approach commercial viability. The majority of devices enabling XR content rely heavily on optics. However, making optics good enough to meet demanding user expectations remains difficult in terms of optical quality, cost, throughput and even sustainability. Additive manufacturing processes are being adopted in XR optics to help close this gap. This presentation covers new additively manufactured materials for XR device optics, along with their possibilities and challenges.