Applied Sciences (Apr 2024)
PSMD-SLAM: Panoptic Segmentation-Aided Multi-Sensor Fusion Simultaneous Localization and Mapping in Dynamic Scenes
Abstract
Multi-sensor fusion is pivotal to improving the robustness and accuracy of simultaneous localization and mapping (SLAM) systems. LiDAR–visual–inertial fusion has been shown empirically to combine the complementary strengths of these sensors across a wide range of scenarios. Meanwhile, panoptic segmentation methods can deliver pixel-level semantic and instance segmentation in a single pass. Building on these techniques, this paper introduces PSMD-SLAM, a novel panoptic segmentation-aided multi-sensor fusion SLAM approach tailored to dynamic environments. Our approach combines panoptic segmentation with probability propagation-based and PCA-based clustering to detect dynamic objects and remove them from the visual and LiDAR data, respectively. We also introduce a module for robust, real-time estimation of the 6D pose of dynamic objects. Experiments on a publicly available dataset show that PSMD-SLAM outperforms other SLAM algorithms in accuracy and robustness, especially in dynamic environments.
Keywords