IEEE Access (Jan 2020)

Integrated Autonomous Relative Navigation Method Based on Vision and IMU Data Fusion

  • Wenlei Liu,
  • Sentang Wu,
  • Yongming Wen,
  • Xiaolong Wu

DOI: https://doi.org/10.1109/ACCESS.2020.2978154
Journal volume & issue: Vol. 8, pp. 51114–51128

Abstract

This paper proposes an integrated autonomous relative navigation method based on the fusion of vision and IMU data, which effectively improves positioning accuracy and adapts well to environmental changes. First, an IMU pre-integration formula based on the Runge-Kutta method is derived, which improves pre-integration position accuracy and effectively reduces accumulated error. Second, an inverse depth estimation method based on a mixed probability model is proposed for the system initialization process, which improves the accuracy of camera depth estimation and provides better initial conditions for back-end optimization. Third, a sliding window filtering method based on the probability graph is proposed, which avoids repeated calculations and improves the efficiency of sliding window filtering. Fourth, a mixed re-projection optimization method is proposed that combines the advantages of the direct method and the feature point method, which broadens the applicable scope of the method and effectively improves optimization accuracy. Finally, a closed-loop optimization method based on similarity transformation is proposed to eliminate the accumulated error. To verify the environmental adaptability of the method and the impact of closed-loop detection on the relative navigation system, indoor and outdoor experiments were carried out with a hand-held camera and an IMU. The EuRoC dataset was used in the experiments, and the proposed method was compared with several classical methods. The experimental results show that the method achieves high accuracy and robustness.
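To illustrate the first contribution, the sketch below propagates the relative rotation, velocity, and position increments between two keyframes with a classical fourth-order Runge-Kutta step rather than the usual Euler or midpoint update. It is a minimal sketch only: it assumes zero-order-hold gyro/accel samples and known constant biases, all names are illustrative rather than from the paper, and the covariance propagation, bias Jacobians, and gravity handling of full pre-integration are omitted.

import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix so that skew(w) @ v == np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def preintegrate_rk4(gyro, accel, dt, bg=np.zeros(3), ba=np.zeros(3)):
    """Integrate N gyro/accel samples into a relative rotation dR, velocity
    increment dv, and position increment dp expressed in the first body frame
    (gravity is compensated later, at the keyframe level).
    gyro, accel: (N, 3) arrays of raw measurements; dt: sample period [s]."""
    dR = np.eye(3)      # relative rotation
    dv = np.zeros(3)    # relative velocity increment
    dp = np.zeros(3)    # relative position increment
    for w_m, a_m in zip(gyro, accel):
        w = w_m - bg    # bias-corrected angular rate
        a = a_m - ba    # bias-corrected specific force

        def deriv(R, v):
            # Coupled kinematics: dR/dt = R [w]x, dv/dt = R a, dp/dt = v
            return R @ skew(w), R @ a, v

        # Classical RK4 stages (measurements held constant over the step)
        k1_R, k1_v, k1_p = deriv(dR, dv)
        k2_R, k2_v, k2_p = deriv(dR + 0.5 * dt * k1_R, dv + 0.5 * dt * k1_v)
        k3_R, k3_v, k3_p = deriv(dR + 0.5 * dt * k2_R, dv + 0.5 * dt * k2_v)
        k4_R, k4_v, k4_p = deriv(dR + dt * k3_R, dv + dt * k3_v)

        dp += dt / 6.0 * (k1_p + 2 * k2_p + 2 * k3_p + k4_p)
        dv += dt / 6.0 * (k1_v + 2 * k2_v + 2 * k3_v + k4_v)
        dR += dt / 6.0 * (k1_R + 2 * k2_R + 2 * k3_R + k4_R)

        # Re-project dR onto SO(3): the additive RK4 update does not
        # preserve orthogonality exactly.
        U, _, Vt = np.linalg.svd(dR)
        dR = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return dR, dv, dp

Calling preintegrate_rk4 on the raw samples between two keyframes yields increments that a back end can later correct for gravity and updated bias estimates; the higher-order integration within each step is what reduces the accumulated error relative to a simple Euler update.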

Keywords