IEEE Access (Jan 2019)

A Compatible Framework for RGB-D SLAM in Dynamic Scenes

  • Lili Zhao,
  • Zhili Liu,
  • Jianwen Chen,
  • Weitong Cai,
  • Wenyi Wang,
  • Liaoyuan Zeng

DOI
https://doi.org/10.1109/ACCESS.2019.2922733
Journal volume & issue
Vol. 7
pp. 75604 – 75614

Abstract


Localization and mapping in a dynamic scene is a crucial problem for indoor visual simultaneous localization and mapping (SLAM) systems. Most existing visual odometry (VO) or SLAM systems are built on the assumption that the environment is static, and their performance may degrade severely when they operate in a highly dynamic environment. This assumption limits the application of RGB-D SLAM in dynamic environments. In this paper, we propose a workflow that accurately segments objects and marks them as potentially dynamic regions based on semantic information. We also introduce a novel approach for detecting and removing motion observed from a moving camera, and we integrate the semantics-based motion detection and segmentation approach into an RGB-D SLAM system. To evaluate the effectiveness of the proposed approach, we conduct experiments on the challenging dynamic sequences of the TUM RGB-D dataset. The experimental results show that our approach improves localization accuracy and outperforms state-of-the-art dynamic-removal-based SLAM systems in both severely dynamic and slightly dynamic scenes.
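The abstract describes masking potentially dynamic regions with semantic segmentation before pose estimation. The snippet below is a minimal sketch of that general idea, not the authors' implementation: it assumes a binary `dynamic_mask` produced by a semantic segmenter (e.g. pixels labeled "person") and discards feature points that fall inside the masked area so that only static features feed the VO front end. All names and shapes here are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): drop feature points that fall inside
# semantically segmented, potentially dynamic regions before pose estimation.
import numpy as np


def filter_static_keypoints(keypoints: np.ndarray, dynamic_mask: np.ndarray) -> np.ndarray:
    """Keep only keypoints lying outside the dynamic-object mask.

    keypoints    : (N, 2) array of (u, v) pixel coordinates.
    dynamic_mask : (H, W) uint8 array, 1 = potentially dynamic pixel.
    """
    u = np.clip(keypoints[:, 0].astype(int), 0, dynamic_mask.shape[1] - 1)
    v = np.clip(keypoints[:, 1].astype(int), 0, dynamic_mask.shape[0] - 1)
    static = dynamic_mask[v, u] == 0  # True where the pixel is not marked dynamic
    return keypoints[static]


if __name__ == "__main__":
    # Toy example: a 480x640 frame whose left half is marked as dynamic.
    mask = np.zeros((480, 640), dtype=np.uint8)
    mask[:, :320] = 1
    kps = np.array([[100.0, 240.0],   # inside the dynamic region -> dropped
                    [500.0, 240.0]])  # static region -> kept
    print(filter_static_keypoints(kps, mask))  # [[500. 240.]]
```

In a full pipeline, the surviving keypoints would then be used for frame-to-frame pose estimation, with an additional geometric motion check (as the paper proposes) to catch moving objects that the semantic labels alone miss.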

Keywords