IEEE Access (Jan 2023)

A Dense Visual SLAM Method in Dynamic Scenes

  • Xia Liu,
  • Jiachen Zhang,
  • Bo Wang,
  • Qifeng Jiang

DOI
https://doi.org/10.1109/ACCESS.2023.3332691
Journal volume & issue
Vol. 11
pp. 138530 – 138539

Abstract


Visual SLAM algorithms have matured considerably and achieve good performance in most static scenes. When applied to dynamic scenes, however, they are disturbed by the large number of dynamic points in the scene, which degrades accuracy and can even cause tracking failures; in addition, the sparse maps built by traditional SLAM algorithms cannot satisfy higher-level needs such as navigation. To address these problems, this paper proposes a dense visual SLAM method for dynamic scenes. Building on the ORB-SLAM2 algorithm, we use an improved balanced quadtree method to obtain uniformly distributed feature points, apply a motion grid statistics method to improve the accuracy of feature matching between neighboring frames, integrate an object detection network to obtain prior information about dynamic objects, and then construct a dense map containing only static information in the dense mapping thread. Finally, to verify the effectiveness of the proposed method, it is compared with ORB-SLAM2, ORB-SLAM3, DS-SLAM, and RDS-SLAM on the TUM dataset. The results show that, on the evaluated high-dynamic sequences, the proposed SLAM algorithm reduces the root mean square error of the absolute trajectory error by 97.54%, 97.35%, 28.74%, and 17.37% compared to the four algorithms, respectively.
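
For reference, the absolute trajectory error (ATE) metric cited in the comparison is conventionally reported as the root mean square of the per-pose translation error between the estimated and ground-truth trajectories after time association and rigid alignment. The snippet below is a minimal sketch of that computation, not the paper's evaluation code; the function name and toy trajectories are illustrative assumptions.

```python
import numpy as np

def ate_rmse(est_xyz: np.ndarray, gt_xyz: np.ndarray) -> float:
    """Root mean square error of the absolute trajectory error (ATE).

    Assumes the two trajectories are already time-associated and expressed
    in a common frame (e.g. after a rigid-body alignment such as
    Horn/Umeyama), each given as an N x 3 array of positions.
    """
    errors = np.linalg.norm(est_xyz - gt_xyz, axis=1)  # per-pose translation error
    return float(np.sqrt(np.mean(errors ** 2)))        # RMSE over the whole trajectory

# Hypothetical usage with toy trajectories:
if __name__ == "__main__":
    gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
    est = gt + np.array([[0.01, 0.0, 0.0], [0.0, 0.02, 0.0], [0.0, 0.0, 0.01]])
    print(f"ATE RMSE: {ate_rmse(est, gt):.4f} m")
```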

Keywords