IEEE Access (Jan 2023)

YG-SLAM: Enhancing Visual SLAM in Dynamic Environments With YOLOv8 and Geometric Constraints

  • Guoming Chu,
  • Yan Peng,
  • Xuhong Luo,
  • Jing Gong

DOI
https://doi.org/10.1109/ACCESS.2023.3342080
Journal volume & issue
Vol. 11
pp. 141421–141434

Abstract


In dynamic environments, achieving accurate and robust Visual SLAM (Simultaneous Localization and Mapping) remains a significant challenge, particularly for applications in robotic navigation and autonomous driving. This study introduces YG-SLAM, an approach that integrates YOLOv8 and geometric constraints within the ORB-SLAM2 framework to adapt effectively to dynamic scenarios. YOLOv8 is employed for instance segmentation and dynamic object detection, enriching the semantic information while image feature points are extracted. Geometric constraints, including epipolar geometry and Lucas-Kanade optical flow, are then used to filter out feature points belonging to dynamic objects. The tracking thread relies exclusively on static feature points for camera pose estimation, substantially improving the system's localization accuracy. Experimental results on the TUM dataset show that YG-SLAM significantly outperforms traditional ORB-SLAM2 in dynamic environments: the Root Mean Square Error (RMSE) of the absolute trajectory error is reduced by 96.51%, and the RMSE of the relative pose error by 93.60%, compared with ORB-SLAM2. These reductions demonstrate the performance gains of YG-SLAM over traditional ORB-SLAM2 in dynamic environments.
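
The geometric-constraint step summarized above can be illustrated with a minimal OpenCV sketch (not the authors' implementation): feature points are tracked across consecutive frames with pyramidal Lucas-Kanade optical flow, a fundamental matrix is estimated from the correspondences, and points whose distance to their epipolar line exceeds a threshold are flagged as dynamic and excluded from pose estimation. The function name filter_dynamic_points and the threshold value are assumptions made for illustration only.

    import cv2
    import numpy as np

    def filter_dynamic_points(prev_gray, curr_gray, prev_pts, epipolar_thresh=1.0):
        """Illustrative sketch: keep only points consistent with the epipolar geometry.
        prev_pts is a float32 array of shape (N, 1, 2), as required by OpenCV."""
        # Track points with pyramidal Lucas-Kanade optical flow
        curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, curr_gray, prev_pts, None, winSize=(21, 21), maxLevel=3)

        ok = status.ravel() == 1
        p0 = prev_pts[ok].reshape(-1, 2)
        p1 = curr_pts[ok].reshape(-1, 2)

        # Estimate the fundamental matrix from the tracked correspondences (RANSAC)
        F, _ = cv2.findFundamentalMat(p0, p1, cv2.FM_RANSAC, 1.0, 0.99)
        if F is None:
            return p1, np.zeros(len(p1), dtype=bool)

        # Epipolar lines in the current image induced by the previous points
        lines = cv2.computeCorrespondEpilines(p0.reshape(-1, 1, 2), 1, F).reshape(-1, 3)

        # Point-to-line distance: |a*x + b*y + c| / sqrt(a^2 + b^2)
        dist = np.abs(lines[:, 0] * p1[:, 0] + lines[:, 1] * p1[:, 1] + lines[:, 2])
        dist /= np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)

        # Points violating the epipolar constraint are treated as dynamic
        is_dynamic = dist > epipolar_thresh
        return p1[~is_dynamic], is_dynamic  # static points kept for camera pose estimation

In the full pipeline described by the paper, such a geometric check would complement the YOLOv8 instance masks, so that feature points inside detected dynamic objects or violating the epipolar constraint are both excluded before tracking.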

Keywords