Navigation (Aug 2023)

An Embedded High-Precision GNSS-Visual-Inertial Multi-Sensor Fusion Suite

  • Cheng Liu,
  • Shuai Xiong,
  • Yongchao Geng,
  • Song Cheng,
  • Fang Hu,
  • Bo Shao,
  • Fang Li,
  • Jie Zhang

DOI
https://doi.org/10.33012/navi.607
Journal volume & issue
Vol. 70, no. 4

Abstract


Because global navigation satellite systems (GNSSs) and visual-inertial odometry (VIO) are highly complementary, integrated GNSS-VIO navigation has attracted increasing attention in recent years. In this paper, we propose an embedded high-precision multi-sensor fusion suite that includes a multi-frequency, multi-constellation GNSS module, a consumer-grade inertial measurement unit (IMU), and a grayscale camera. The suite uses an NVIDIA Jetson Xavier NX as the host and incorporates a Field-Programmable Gate Array (FPGA)-based controller for hardware time synchronization between the heterogeneous sensors. A multi-state constraint Kalman filter is used to produce a tightly coupled estimate from the camera and the IMU, and the GNSS output is then loosely coupled with this estimate to obtain a globally drift-free solution. Calibration results show that the time synchronization accuracy of the suite is better than 30 µs (standard deviation [STD]) and that the camera-IMU reprojection error is less than 0.1 pixels (STD), highlighting the advantage of the hardware time synchronization mechanism. Results from vehicle-mounted tests on urban roads show a reduction in the three-dimensional (3D) positioning error from 8.455 m to 5.751 m (root mean square), which significantly improves the accuracy and continuity of standalone GNSS positioning. In underground sites where satellite signals are completely unavailable, the 3D position error drift of the suite is only 1.58 ‰, which also demonstrates excellent performance.
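
The abstract does not give implementation details of the loose GNSS-VIO coupling; the following is a minimal illustrative sketch (not the authors' code) of how an absolute GNSS position fix might correct a drifting VIO-propagated position via a simple Kalman update. The class name, noise values, and the assumption of a shared local ENU frame are all hypothetical.

```python
import numpy as np

# Illustrative sketch only (not the authors' implementation): a loosely
# coupled GNSS position update applied to a VIO-propagated position estimate.
# Assumes the VIO output and the GNSS fix are expressed in a common local
# ENU frame; all names and noise values below are hypothetical.

class LooseGnssVioFusion:
    def __init__(self, p0, P0):
        self.p = np.asarray(p0, dtype=float)   # 3D position estimate [m]
        self.P = np.asarray(P0, dtype=float)   # 3x3 position covariance

    def propagate(self, dp_vio, Q_vio):
        """Propagate with the relative displacement reported by VIO."""
        self.p = self.p + dp_vio               # apply local VIO increment
        self.P = self.P + Q_vio                # inflate covariance by VIO noise

    def update_gnss(self, z_gnss, R_gnss):
        """Correct the drifting VIO estimate with an absolute GNSS fix."""
        H = np.eye(3)                          # GNSS observes position directly
        S = H @ self.P @ H.T + R_gnss          # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)    # Kalman gain
        self.p = self.p + K @ (z_gnss - H @ self.p)
        self.P = (np.eye(3) - K @ H) @ self.P


# Example usage with made-up numbers:
fusion = LooseGnssVioFusion(p0=[0.0, 0.0, 0.0], P0=np.eye(3) * 0.01)
fusion.propagate(dp_vio=np.array([1.0, 0.2, 0.0]), Q_vio=np.eye(3) * 0.05)
fusion.update_gnss(z_gnss=np.array([1.1, 0.15, 0.02]), R_gnss=np.eye(3) * 0.25)
print(fusion.p)
```

In the paper's architecture, the tightly coupled camera-IMU estimate would play the role of the propagation step here, while GNSS fixes supply the drift-free global corrections.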