Opto-Electronic Science (Sep 2024)

Edge enhanced depth perception with binocular meta-lens

  • Xiaoyuan Liu,
  • Jingcheng Zhang,
  • Borui Leng,
  • Yin Zhou,
  • Jialuo Cheng,
  • Takeshi Yamaguchi,
  • Takuo Tanaka,
  • Mu Ku Chen

DOI
https://doi.org/10.29026/oes.2024.230033
Journal volume & issue
Vol. 3, no. 9
pp. 1 – 10

Abstract


The increasing popularity of the metaverse has driven growing interest in spatial computing, and a growing market for it, in both academia and industry. Developing portable and accurate imaging and depth-sensing systems is crucial for advancing next-generation virtual-reality devices. This work demonstrates an intelligent, lightweight, and compact edge-enhanced depth perception system that utilizes a binocular meta-lens for spatial computing. The miniaturized system comprises a binocular meta-lens, a 532 nm filter, and a CMOS sensor. For disparity computation, we propose a stereo-matching neural network with a novel H-Module. The H-Module incorporates an attention mechanism into the Siamese network. The symmetric architecture, with cross-pixel and cross-view interaction, enables a more comprehensive analysis of the contextual information in stereo images. Edge enhancement based on spatial intensity discontinuities eliminates ill-posed regions of the image where a lack of texture would otherwise lead to ambiguous depth predictions. With the assistance of deep learning, our edge-enhanced system responds in less than 0.15 seconds. This edge-enhanced depth perception meta-lens imaging system will contribute significantly to accurate 3D scene modeling, machine vision, autonomous driving, and robotics.
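
As a rough illustration of two steps described in the abstract, converting a binocular disparity map to metric depth and masking low-texture (ill-posed) regions via spatial intensity discontinuities, the minimal Python sketch below may help. It is not the authors' implementation: the focal length, baseline, gradient threshold, and synthetic inputs are hypothetical placeholders, and the real system would use the rectified 532 nm meta-lens image pair and the disparity predicted by the H-Module network.

# Minimal sketch, assuming a standard rectified pinhole stereo model.
# All numeric parameters are hypothetical placeholders.

import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Standard stereo relation: depth = focal_length * baseline / disparity."""
    depth = np.full_like(disparity, np.inf, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

def edge_mask(image, grad_thresh=10.0):
    """Keep only pixels with sufficient intensity discontinuity (texture)."""
    gy, gx = np.gradient(image.astype(np.float64))
    grad_mag = np.hypot(gx, gy)
    return grad_mag > grad_thresh

# Example with synthetic data in place of the meta-lens images.
rng = np.random.default_rng(0)
left_image = rng.integers(0, 256, size=(480, 640)).astype(np.float64)
disparity = rng.uniform(1.0, 64.0, size=(480, 640))

depth = disparity_to_depth(disparity, focal_px=800.0, baseline_m=0.004)
mask = edge_mask(left_image)
edge_enhanced_depth = np.where(mask, depth, np.nan)  # drop ill-posed regions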

Keywords