At present, most virtual reality (VR) devices on the market employ static distortion correction by predistorting the virtual image. However, this compensation is only effective when the pupil remains at a fixed position relative to the virtual display device. When the pupil moves within the eye box of the VR device, the virtual image may deviate from the target position, rendering the compensation ineffective. Because the lens is optically asymmetric, the human eye perceives different distortions as the pupil moves, which degrades the user's visual experience. It is therefore essential to measure and evaluate dynamic distortion, both to adjust pre-compensation parameters according to the pupil's position and to guide the design of optical systems with low dynamic distortion. In this paper, we analyzed the causes of dynamic distortion in virtual reality and proposed a novel method for characterizing it that, unlike traditional optical flow maps, enables quantitative analysis. A prototype was fabricated for dynamic distortion evaluation, and both simulation and measurement of the dynamic distortion were conducted. The results show strong agreement between the simulations and measurements.
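To illustrate how a displacement-field (optical-flow-style) description of dynamic distortion can be reduced to a single quantitative figure, the sketch below compares image-point positions perceived at a reference pupil position and at a shifted pupil position and reports an RMS displacement. The grid values, field-angle units, and the RMS metric are illustrative assumptions, not the characterization method proposed in the paper.

```python
import numpy as np

def displacement_field(ref_points, shifted_points):
    """Per-point displacement (an optical-flow-style map) between image points
    perceived at a reference pupil position and at a shifted pupil position.
    Both inputs are (N, 2) arrays of field angles in degrees (assumed units)."""
    return shifted_points - ref_points

def dynamic_distortion_rms(ref_points, shifted_points):
    """A scalar summary of dynamic distortion: the RMS magnitude of the
    displacement field. This metric is an illustrative assumption."""
    d = displacement_field(ref_points, shifted_points)
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))

# Example: a 5x5 grid of target field angles and a toy radial shift
# standing in for data measured at a displaced pupil position.
xs, ys = np.meshgrid(np.linspace(-20, 20, 5), np.linspace(-20, 20, 5))
ref = np.column_stack([xs.ravel(), ys.ravel()])
shifted = ref + 0.05 * ref / 20.0
print(f"dynamic distortion (RMS, deg): {dynamic_distortion_rms(ref, shifted):.4f}")
```

In practice the two point sets would come from simulation of the optical system or from measurements taken with the prototype at different pupil positions within the eye box.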