IEEE Access (Jan 2019)

Multi-Sensor Depth Fusion Framework for Real-Time 3D Reconstruction

  • Muhammad Kashif Ali,
  • Asif Rajput,
  • Muhammad Shahzad,
  • Farhan Khan,
  • Faheem Akhtar,
  • Anko Borner

DOI
https://doi.org/10.1109/ACCESS.2019.2942375
Journal volume & issue
Vol. 7
pp. 136471–136480

Abstract


For autonomous robots, 3D perception of the environment is an essential tool for achieving better navigation in obstacle-rich surroundings. This understanding demands substantial computational resources; therefore, real-time 3D reconstruction of the surrounding environment has become a topic of interest for many researchers in recent years. Outdoor 3D models are generally built from stereo cameras and laser depth-measuring sensors. The data collected by laser ranging sensors is relatively accurate but sparse in nature. In this paper, we propose a novel mechanism for the incremental fusion of this sparse data with the dense but range-limited data provided by stereo cameras, producing accurate dense depth maps in real time on a resource-limited mobile computing device. Evaluation of the proposed method shows that it outperforms state-of-the-art reconstruction frameworks that utilize depth information from only a single source.
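The abstract does not detail the paper's incremental fusion mechanism, so the following is only a minimal sketch of the general idea it describes: a sparse but accurate laser (LiDAR) depth map, projected into the same image frame, filling in and correcting a dense but range-limited stereo depth map. The function name `fuse_depth`, the parameters `stereo_max_range` and `lidar_weight`, and the simple per-pixel blending rule are illustrative assumptions, not the authors' method.

```python
import numpy as np

def fuse_depth(stereo_depth, lidar_depth, stereo_max_range=20.0, lidar_weight=0.8):
    """Fuse a dense (but range-limited) stereo depth map with a sparse,
    accurate LiDAR depth map projected into the same image frame.

    Both inputs are H x W arrays in metres; zeros mark missing measurements.
    """
    fused = stereo_depth.copy()

    # Stereo depth beyond its reliable range is treated as missing.
    fused[fused > stereo_max_range] = 0.0

    lidar_valid = lidar_depth > 0.0
    stereo_valid = fused > 0.0

    # Where only the LiDAR measures, take its value directly.
    only_lidar = lidar_valid & ~stereo_valid
    fused[only_lidar] = lidar_depth[only_lidar]

    # Where both sensors measure, blend with a fixed confidence weight
    # favouring the more accurate LiDAR return (assumed weighting scheme).
    both = lidar_valid & stereo_valid
    fused[both] = (lidar_weight * lidar_depth[both]
                   + (1.0 - lidar_weight) * fused[both])
    return fused

if __name__ == "__main__":
    h, w = 4, 6
    stereo = np.random.uniform(1.0, 30.0, (h, w))        # dense stereo estimate
    lidar = np.zeros((h, w))
    lidar[::2, ::3] = np.random.uniform(1.0, 60.0,       # sparse LiDAR hits
                                        lidar[::2, ::3].shape)
    print(fuse_depth(stereo, lidar))
```

In the paper's setting this kind of per-pixel fusion would run incrementally as new frames arrive; the fixed blending weight used here stands in for whatever per-sensor confidence model the authors actually employ.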

Keywords