Virtual Reality & Intelligent Hardware (Feb 2020)

Co-axial depth sensor with an extended depth range for AR/VR applications

  • Mohan Xu,
  • Hong Hua

Journal volume & issue
Vol. 2, no. 1
pp. 1–11

Abstract

Background: A depth sensor is an essential element of virtual and augmented reality devices, digitizing the user's environment in real time. The currently popular technologies include stereo, structured light, and Time-of-Flight (ToF) sensing. Stereo and structured-light methods require a baseline separation between multiple sensors, and both suffer from a limited measurement range; ToF depth sensors have the largest depth range but the lowest depth-map resolution. To overcome these problems, we propose a co-axial depth-map sensor that is potentially more compact and cost-effective than conventional structured-light depth cameras. It extends the depth range while maintaining a high depth-map resolution, and it also provides a high-resolution 2D image along with the 3D depth map.

Methods: The depth sensor consists of a projection path and an imaging path, combined by a beamsplitter into a co-axial design. In the projection path, a cylindrical lens adds extra optical power in one direction, creating an astigmatic pattern. For depth measurement, the astigmatic pattern is projected onto the test scene, and the depth information is calculated from the contrast change of the reflected pattern image in two orthogonal directions. To extend the measurement range, we place an electronically focus-tunable lens at the system stop and tune its power to cover an extended depth range without compromising depth resolution.

Results: In the depth-measurement simulation, we project a resolution target onto a white screen that moves along the optical axis and tune the focus-tunable lens power for three depth sub-ranges: near, middle, and far. In each sub-range, as the test screen moves away from the depth sensor, the horizontal contrast of the reflected image keeps increasing while the vertical contrast keeps decreasing; the depth information can therefore be obtained by computing the contrast ratio between features in orthogonal directions.

Conclusions: The proposed depth-map sensor can perform depth measurement over an extended depth range with a co-axial design.

Keywords: Depth map sensor, 3D camera, Controlled aberration
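
To illustrate the contrast-ratio step described above, the following Python sketch estimates the depth of one image patch from the ratio of horizontal to vertical contrast of the reflected astigmatic pattern. It is a minimal illustration, not the authors' implementation: the Michelson-style contrast metric, the function names, and the calibration table (a hypothetical lookup from contrast ratio to depth for a single focus-tunable-lens setting, obtained from calibration or simulation) are all assumptions.

    import numpy as np

    def directional_contrast(image, axis):
        # Michelson-style contrast of the mean intensity profile taken along one
        # axis; a simplified stand-in for the paper's feature-contrast measurement.
        profile = image.mean(axis=axis)  # axis=0: profile over columns (horizontal variation)
        i_max, i_min = profile.max(), profile.min()
        return (i_max - i_min) / (i_max + i_min + 1e-12)

    def depth_from_contrast_ratio(patch, calibration):
        # Depth of a reflected-pattern patch from the ratio of horizontal to
        # vertical contrast; `calibration` is a hypothetical ratio-to-depth table.
        c_h = directional_contrast(patch, axis=0)  # contrast along the horizontal direction
        c_v = directional_contrast(patch, axis=1)  # contrast along the vertical direction
        ratio = c_h / (c_v + 1e-12)
        return float(np.interp(ratio, calibration["ratio"], calibration["depth"]))

    # Example with made-up numbers: a synthetic patch and a three-point calibration curve.
    calibration = {"ratio": np.array([0.5, 1.0, 2.0]),   # hypothetical contrast ratios
                   "depth": np.array([0.3, 0.6, 1.2])}   # corresponding depths in metres
    patch = np.random.rand(64, 64)                       # stand-in for a reflected-pattern patch
    print(depth_from_contrast_ratio(patch, calibration))

In practice, one such calibration curve would be needed for each focus-tunable-lens power setting (the near, middle, and far sub-ranges), and the per-patch depths would then be assembled into the full depth map.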