IEEE Access (Jan 2020)
Research on SLAM Algorithm of Mobile Robot Based on the Fusion of 2D LiDAR and Depth Camera
Abstract
This paper proposes a new graph-optimization-based Simultaneous Localization and Mapping (SLAM) method that fuses a 2D Light Detection and Ranging (LiDAR) sensor, an RGB-D camera, a wheel encoder, and an Inertial Measurement Unit (IMU). The four sensors are jointly fused for positioning with an unscented Kalman filter (UKF), and a dedicated registration strategy is designed for the 2D LiDAR point cloud and the RGB-D camera point cloud. In the sequential registration stage, 3D point cloud information generated by the RGB-D camera mounted below the 2D LiDAR is incorporated into the method, and the 2D LiDAR point cloud and the 3D RGB-D point cloud are matched using Correlative Scan Matching (CSM). In the loop closure detection stage, the method further verifies loop closures found by 2D LiDAR matching using a 3D point cloud descriptor. The feasibility and effectiveness of the method are verified through theoretical derivation, simulation experiments, and physical experiments. The results show that the proposed multi-sensor SLAM framework achieves good mapping quality with high precision and accuracy.
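The abstract does not give implementation details of the CSM step. As a rough illustration of the idea behind correlative scan matching, the following is a minimal pure-Python sketch (function names and parameters are hypothetical, not from the paper): the reference scan is rasterized into an occupancy grid, and a brute-force search over discretized translations and rotations selects the pose that places the most points of the current scan onto occupied cells.

```python
import math

def build_grid(points, res=0.1):
    # Rasterize reference scan points into a set of occupied grid cells.
    return {(round(x / res), round(y / res)) for x, y in points}

def csm_match(ref_points, cur_points, res=0.1,
              search_xy=0.5, search_theta=0.2, step_theta=0.05):
    # Brute-force correlative scan matching: over a discretized window of
    # candidate (dx, dy, dtheta), count how many transformed current-scan
    # points land on occupied cells of the reference grid, and return the
    # best-scoring pose offset together with its score.
    grid = build_grid(ref_points, res)
    steps = int(search_xy / res)
    thetas = []
    t = -search_theta
    while t <= search_theta + 1e-9:
        thetas.append(t)
        t += step_theta
    best, best_score = (0.0, 0.0, 0.0), -1
    for th in thetas:
        c, s = math.cos(th), math.sin(th)
        rotated = [(c * x - s * y, s * x + c * y) for x, y in cur_points]
        for ix in range(-steps, steps + 1):
            for iy in range(-steps, steps + 1):
                dx, dy = ix * res, iy * res
                score = sum(
                    (round((x + dx) / res), round((y + dy) / res)) in grid
                    for x, y in rotated)
                if score > best_score:
                    best_score, best = score, (dx, dy, th)
    return best, best_score
```

For example, matching an L-shaped scan against a copy of itself translated by (0.2, 0) recovers the compensating offset (-0.2, 0, 0). A real implementation would additionally use a smoothed likelihood grid and a coarse-to-fine multi-resolution search rather than a single-resolution exhaustive sweep.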
Keywords