Abstract
In this work, which is part of the U.S. Air Force School of Aerospace Medicine Operational Based Vision Assessment program, a high-resolution stereoscopic head-mounted display (HMD), X-Plane image generators, and an OptiTrack head-tracking system were used to render an immersive three-dimensional constructive environment. The purpose of this effort was to quantify the impact of aircrew vision on an operationally relevant rotary-wing call-to-landing task in order to investigate the applicability of U.S. Air Force Flying Class III depth perception standards. Before this research was performed, an evaluation was carried out to determine whether the simulated environment could support eye-limited stereoscopic disparity. This paper details the psychometric validation of the stereoscopic rendering of a virtual environment generated by game-based simulation software and presented in a high-resolution HMD. The minimum perceivable stereoscopic disparity threshold of the system is also quantified, along with its applicability to simulated tasks requiring precise depth discrimination. This work provides an example validation method for future stereoscopic immersive virtual environments applicable to both research and training.