International Journal of Applied Earth Observation and Geoinformation (Nov 2023)

Multi-sensor fusion for robust localization with moving object segmentation in complex dynamic 3D scenes

  • Qipeng Li,
  • Yuan Zhuang,
  • Jianzhu Huai

Journal volume & issue
Vol. 124, article no. 103507

Abstract


Accurate and robust localization is a key module of autonomous driving and multi-sensor robotic systems in complex urban scenes. Compared with camera-based localization, which is sensitive to illumination changes across environments, LiDAR-based localization offers higher accuracy and robustness. However, existing LiDAR localization methods often assume a static scene and ignore the impact of moving objects on localization and mapping. Globally consistent multi-sensor localization remains an open problem, especially in complex dynamic environments. We propose a LiDAR-based dynamic scene perception and multi-sensor localization framework. To improve localization robustness in dynamic, complex scenes, we design a moving object segmentation (MOS) module based on spatio-temporal information, and further optimize ground segmentation. On this basis, we design a lightweight, tightly coupled LiDAR-IMU-GNSS odometry framework that achieves accurate, real-time pose estimation. In the front end, based on an iterated extended Kalman filter, we register the raw 3D point cloud directly to the map without a feature extractor; this adapts to different LiDAR scanning patterns and allows tight coupling with IMU data. We verify the proposed MOS module on the KITTI dataset, where it achieves state-of-the-art performance compared with existing methods. In addition, we evaluate the localization performance of the proposed framework on two public datasets and one solid-state LiDAR dataset. The results show that our system is more accurate than existing state-of-the-art fusion-based localization methods.
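The abstract does not include code, so the following is only a minimal sketch of one common way to exploit spatio-temporal information for MOS: project the current scan and a pose-aligned past scan into range images and flag pixels with large normalized range residuals as moving. The projection parameters, the 0.1 threshold, and all function names here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def spherical_projection(points, h=64, w=1024, fov_up=3.0, fov_down=-25.0):
    """Project an (N, 3) LiDAR scan onto an h x w range image (-1 = empty)."""
    depth = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(points[:, 1], points[:, 0])
    pitch = np.arcsin(points[:, 2] / np.maximum(depth, 1e-8))
    fov_up_r, fov_down_r = np.radians(fov_up), np.radians(fov_down)
    u = (0.5 * (1.0 - yaw / np.pi) * w).astype(int) % w
    v = ((fov_up_r - pitch) / (fov_up_r - fov_down_r) * h).clip(0, h - 1).astype(int)
    image = np.full((h, w), -1.0)
    image[v, u] = depth        # later points overwrite earlier ones; fine for a sketch
    return image

def moving_mask(curr_scan, past_scan_aligned, thresh=0.1):
    """Flag range-image pixels whose normalized range change exceeds thresh.

    `past_scan_aligned` is a previous scan already transformed into the
    current scan's frame using the odometry estimate (temporal information).
    """
    r_curr = spherical_projection(curr_scan)
    r_past = spherical_projection(past_scan_aligned)
    valid = (r_curr > 0) & (r_past > 0)
    residual = np.zeros_like(r_curr)
    residual[valid] = np.abs(r_curr[valid] - r_past[valid]) / r_curr[valid]
    return valid & (residual > thresh)
```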
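Similarly, the front end's direct, feature-free registration can be illustrated with a simplified iterated-EKF-style update: point-to-plane residuals against the map are relinearized each iteration and solved together with the IMU-propagated prior covariance. This is a hedged sketch of the general technique, not the authors' implementation; the measurement noise value, the omission of the prior-mean residual term, and all names are assumptions.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix so that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def exp_so3(phi):
    """Rodrigues' formula: map a rotation vector to a rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-9:
        return np.eye(3) + skew(phi)
    a = phi / theta
    return (np.cos(theta) * np.eye(3)
            + np.sin(theta) * skew(a)
            + (1.0 - np.cos(theta)) * np.outer(a, a))

def iekf_pose_update(R, t, P, scan_pts, map_pts, map_normals,
                     meas_var=0.01, iters=5):
    """Refine the IMU-propagated pose (R, t) by iteratively relinearizing
    point-to-plane residuals against the map, damped by the 6x6 prior
    covariance P over the [rotation; translation] error state."""
    for _ in range(iters):
        H, r = [], []
        for p, q, n in zip(scan_pts, map_pts, map_normals):
            pw = R @ p + t                          # scan point in map frame
            r.append(n @ (pw - q))                  # signed plane distance
            # Jacobian w.r.t. the right perturbation [dtheta, dt]
            H.append(np.hstack([-n @ R @ skew(p), n]))
        H, r = np.asarray(H), np.asarray(r)
        # Gauss-Newton normal equations with the Gaussian prior as damping
        # (the prior-mean residual term is omitted to keep the sketch short)
        A = H.T @ H / meas_var + np.linalg.inv(P)
        b = -H.T @ r / meas_var
        dx = np.linalg.solve(A, b)
        R = R @ exp_so3(dx[:3])                     # update on SO(3)
        t = t + dx[3:]
    return R, t
```

In a full pipeline of the kind the abstract describes, a mask like the one from the first sketch would be applied before this update, so that points on moving objects never enter the registration or the map.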

Keywords