Smart Agriculture (智慧农业) (May 2024)

Localization Method for Agricultural Robots Based on Fusion of LiDAR and IMU

  • LIU Yang,
  • JI Jie,
  • PAN Deng,
  • ZHAO Lijun,
  • LI Mingsheng

DOI
https://doi.org/10.12133/j.smartag.SA202401009
Journal volume & issue
Vol. 6, no. 3
pp. 94 – 106

Abstract


Objective

High-precision localization is the foundation that enables intelligent agricultural robots to navigate autonomously. However, the traditional global navigation satellite system (GNSS) localization method faces numerous limitations in agricultural environments, such as occlusion by trees and electromagnetic interference, which challenge the accuracy and reliability of localization. To address these deficiencies and achieve precise localization of agricultural robots independent of GNSS, a localization method based on the fusion of three-dimensional light detection and ranging (LiDAR) data and inertial measurement unit (IMU) information was proposed to enhance localization accuracy and reliability.

Methods

LiDAR was used to acquire point cloud data of the agricultural environment and to realize self-localization through point cloud matching, while the IMU supplied real-time motion measurements; fusing the two yielded a high-precision localization solution for agricultural robots. Firstly, the raw LiDAR point cloud was preprocessed and stored as a depth (range) image. This representation reduced the dimensionality of the original point cloud and removed the disorder of its arrangement, making traversal and clustering by graph search straightforward. Because the agricultural environment contains many distinct objects such as trees, an angle-based clustering method was adopted: clustering criteria defined on the angles between neighboring range measurements were used to group the points into separate clusters, so that prominent crops in the environment were effectively perceived. Furthermore, to improve the accuracy and stability of positioning, an improved three-dimensional normal distributions transform (3D-NDT) localization algorithm was proposed, which matched the LiDAR scans in real time against a pre-built, down-sampled point cloud map. Since directly down-sampling LiDAR point clouds of the agricultural environment could discard crucial environmental detail, the point cloud clustering operation was used in place of down-sampling, thereby improving matching accuracy and positioning precision. Secondly, to overcome the constraints and shortcomings of localizing with a single sensor, a multi-sensor information fusion strategy was deployed: the extended Kalman filter (EKF) algorithm was chosen to fuse the pose estimated from LiDAR point cloud matching with the IMU odometry information. The IMU provided essential motion parameters such as the acceleration and angular velocity of the agricultural robot, and combining them with the LiDAR-derived localization information allowed the robot's position to be estimated more accurately. This fusion approach exploited the complementary strengths of the different sensors, compensated for their individual limitations, and improved the overall localization accuracy of the agricultural robot.
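As a rough illustration of the depth-image representation and the angle-based clustering described above, the following Python sketch groups a LiDAR range image into clusters by breadth-first search. The specific criterion used here (the angle between the sensor ray and the line joining two neighbouring returns) and all parameter values are illustrative assumptions and are not taken from the paper, whose exact clustering rule is not given in the abstract.

```python
import numpy as np
from collections import deque

def angle_based_clustering(range_img, h_res_deg=0.2, v_res_deg=2.0,
                           beta_thresh_deg=10.0, min_cluster_size=20):
    """Cluster a LiDAR range image with an angle criterion (illustrative).

    range_img: 2-D array of ranges in metres (rows = laser rings,
    cols = azimuth bins); zeros mark missing returns. Two neighbouring
    pixels join the same cluster when the angle beta between the laser
    ray and the line connecting the two returns exceeds beta_thresh_deg
    (a steep angle suggests the same surface, a shallow one a depth jump).
    """
    rows, cols = range_img.shape
    labels = np.zeros((rows, cols), dtype=np.int32)   # 0 = unlabelled
    step = {(-1, 0): np.radians(v_res_deg), (1, 0): np.radians(v_res_deg),
            (0, -1): np.radians(h_res_deg), (0, 1): np.radians(h_res_deg)}
    beta_thresh = np.radians(beta_thresh_deg)
    next_label = 1

    for r0 in range(rows):
        for c0 in range(cols):
            if labels[r0, c0] != 0 or range_img[r0, c0] <= 0:
                continue
            # breadth-first search over the 4-neighbourhood of the image
            queue, members = deque([(r0, c0)]), []
            labels[r0, c0] = next_label
            while queue:
                r, c = queue.popleft()
                members.append((r, c))
                for (dr, dc), alpha in step.items():
                    rn, cn = r + dr, (c + dc) % cols   # azimuth wraps around
                    if not (0 <= rn < rows) or labels[rn, cn] != 0:
                        continue
                    d1 = max(range_img[r, c], range_img[rn, cn])
                    d2 = min(range_img[r, c], range_img[rn, cn])
                    if d2 <= 0:                        # skip missing returns
                        continue
                    beta = np.arctan2(d2 * np.sin(alpha),
                                      d1 - d2 * np.cos(alpha))
                    if beta > beta_thresh:             # same surface -> same cluster
                        labels[rn, cn] = next_label
                        queue.append((rn, cn))
            if len(members) < min_cluster_size:        # drop tiny clusters as noise
                for r, c in members:
                    labels[r, c] = -1
            else:
                next_label += 1
    return labels
```

Operating on the range image rather than the unordered point cloud keeps the neighbourhood lookup constant-time, which is what makes the graph-search traversal mentioned above practical.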
Results and Discussions

A series of experiments in the Gazebo simulation environment of the robot operating system (ROS) and in real operation scenarios showed that the proposed fusion localization method had significant advantages. In the simulation environment, the average localization errors of the proposed multi-sensor data fusion localization method were 1.7 and 1.8 cm, respectively, while in the real-world experiments they were 3.3 and 3.3 cm, respectively, significantly better than those of the traditional 3D-NDT localization algorithm. These findings show that the proposed localization method can achieve high-precision localization in complex agricultural environments and provide reliable localization support for the autonomous operation of agricultural robots.

Conclusions

The proposed localization method based on the fusion of LiDAR data and IMU information offers a novel localization solution for the autonomous operation of agricultural robots in areas with limited GNSS reception. By comprehensively utilizing multi-sensor information and adopting advanced data processing and fusion algorithms, the localization accuracy of agricultural robots can be significantly improved, providing a new reference for the intelligence and automation of agricultural production.
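As a rough illustration of the LiDAR-IMU fusion step summarized above, the sketch below implements a minimal planar extended Kalman filter in which IMU readings drive the prediction and the pose from LiDAR point cloud matching provides the correction. The 4-state constant-turn-rate model, the noise values, and all names are illustrative assumptions; the paper's actual filter formulation is not given in the abstract.

```python
import numpy as np

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

class LidarImuEKF:
    """Planar EKF sketch: IMU drives prediction, LiDAR matching corrects it.

    State x = [px, py, yaw, v]; the IMU supplies forward acceleration a and
    yaw rate w, the LiDAR scan matcher supplies an absolute pose
    measurement z = [px, py, yaw]. All parameters are illustrative.
    """

    def __init__(self, q_diag=(0.05, 0.05, 0.01, 0.1),
                 r_diag=(0.03, 0.03, 0.02)):
        self.x = np.zeros(4)
        self.P = np.eye(4) * 0.1
        self.Q = np.diag(q_diag)        # process noise (IMU integration drift)
        self.R = np.diag(r_diag)        # measurement noise (scan matching)

    def predict(self, a, w, dt):
        px, py, yaw, v = self.x
        # constant-turn-rate, constant-acceleration motion model
        self.x = np.array([px + v * np.cos(yaw) * dt,
                           py + v * np.sin(yaw) * dt,
                           wrap_angle(yaw + w * dt),
                           v + a * dt])
        F = np.array([[1, 0, -v * np.sin(yaw) * dt, np.cos(yaw) * dt],
                      [0, 1,  v * np.cos(yaw) * dt, np.sin(yaw) * dt],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]])
        self.P = F @ self.P @ F.T + self.Q * dt

    def update(self, z):
        # measurement model: scan matching observes position and heading
        H = np.array([[1, 0, 0, 0],
                      [0, 1, 0, 0],
                      [0, 0, 1, 0]], dtype=float)
        y = z - H @ self.x
        y[2] = wrap_angle(y[2])          # keep the yaw residual small
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(4) - K @ H) @ self.P

# Example (hypothetical values): 50 Hz IMU prediction, correction whenever
# a new pose from 3D-NDT matching arrives.
# ekf = LidarImuEKF()
# ekf.predict(a=0.2, w=0.05, dt=0.02)
# ekf.update(np.array([0.11, 0.02, 0.01]))
```

Running predict() at the IMU rate and update() only when a new matching result arrives is what lets the two asynchronous sensor streams be combined into one pose estimate.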

Keywords