IEEE Access (Jan 2023)
Real-Time SLAM Based on Dynamic Feature Point Elimination in Dynamic Environment
Abstract
SLAM (simultaneous localization and mapping) plays an important role in the fields of artificial intelligence and driverless vehicles. A real-time dynamic visual SLAM algorithm based on an object detection network is proposed to address the robustness and camera localization accuracy issues caused by dynamic objects in indoor dynamic scenes. The YOLOv5s model, which has the smallest depth and feature map width in the YOLOv5 series, is chosen as the object detection network, and its backbone is replaced with the lightweight ShuffleNetv2 network. Experimental results on the VOC2007 dataset show that the resulting YOLOv5-LITE model reduces the number of network parameters by 41.89% and shortens the runtime by 39.00% compared with the YOLOv5s model. A motion-level division strategy is adopted to provide prior information to the object detection network. In the tracking thread of the visual SLAM system, a parallel thread combining the improved object detection network with multi-view geometry is introduced to eliminate dynamic feature points. Experimental results demonstrate that in dynamic scenes the proposed algorithm improves camera localization accuracy by an average of 85.38% compared with ORB-SLAM2. Finally, experiments in a real environment validate the effectiveness of the algorithm.
Keywords