Nihon Kikai Gakkai ronbunshu (Oct 2021)

Real-time self-attitude estimation using visual images and/or structures of the environments

  • Ryota OZAKI
  • Yoji KURODA

DOI
https://doi.org/10.1299/transjsme.21-00098
Journal volume & issue
Vol. 87, no. 903
pp. 21-00098 – 21-00098

Abstract

This paper presents a real-time self-attitude estimation method that utilizes clues to the direction of gravity hidden in images and in the structure of the environment. In the proposed method, a gyroscope provides angular velocities for integration, while a camera-based method and a LiDAR-based method each estimate the gravity direction. These estimates are fused with an extended Kalman filter (EKF). The camera-based gravity direction estimation uses a deep neural network (DNN) that learns the regularity between the gravity direction and landscape information; having learned this regularity, the DNN can infer the gravity direction from a single image. The DNN outputs both a mean and a variance to express the uncertainty of its inference. The LiDAR-based gravity direction estimation extracts vertical planes from the surrounding environment measured by the LiDAR and derives the gravity direction from their normals. Using both the camera and the LiDAR yields more robust and accurate estimation. Static validations on test datasets show that the DNN can estimate the direction of gravity with an expression of its uncertainty, and dynamic validations show that the proposed EKF-based method can estimate the attitude in real time. These validations are performed both in a simulator and in the real world to compare the proposed method with conventional methods.
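
As a rough illustration of the fusion described in the abstract, the following Python sketch propagates roll and pitch by integrating gyroscope rates and then corrects them with a single gravity-direction observation, such as the mean produced by the camera DNN or the direction derived from LiDAR plane normals. The measurement covariance R_meas, e.g. built from the DNN's predicted variances, decides how strongly each observation pulls the estimate. The two-angle state, the Euler-angle conventions, and all names here are illustrative assumptions, not the authors' implementation.

import numpy as np

def gravity_dir_body(roll, pitch):
    # Unit gravity direction in the body frame for given roll/pitch
    # (ZYX Euler angles, world z-axis pointing up -- an assumed convention).
    sr, cr = np.sin(roll), np.cos(roll)
    sp, cp = np.sin(pitch), np.cos(pitch)
    return np.array([sp, -sr * cp, -cr * cp])

def predict(x, P, gyro, Q, dt):
    # Prediction step: integrate body angular rates (p, q, r) into roll/pitch.
    roll, pitch = x
    p, q, r = gyro
    sr, cr = np.sin(roll), np.cos(roll)
    tp = np.tan(pitch)
    x_dot = np.array([p + (q * sr + r * cr) * tp,
                      q * cr - r * sr])
    # Jacobian of the Euler-angle kinematics w.r.t. (roll, pitch)
    F = np.eye(2) + dt * np.array([
        [(q * cr - r * sr) * tp, (q * sr + r * cr) / np.cos(pitch) ** 2],
        [-q * sr - r * cr, 0.0]])
    return x + x_dot * dt, F @ P @ F.T + Q

def update(x, P, g_meas, R_meas):
    # Correction step: fuse one gravity-direction observation. R_meas is the
    # 3x3 measurement covariance, e.g. diag of the DNN's predicted variances.
    roll, pitch = x
    sr, cr = np.sin(roll), np.cos(roll)
    sp, cp = np.sin(pitch), np.cos(pitch)
    H = np.array([[0.0, cp],
                  [-cr * cp, sr * sp],
                  [sr * cp, cr * sp]])   # Jacobian of gravity_dir_body
    S = H @ P @ H.T + R_meas             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (g_meas - gravity_dir_body(roll, pitch))
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: one gyro prediction followed by one camera-DNN correction.
x, P = np.zeros(2), np.eye(2) * 0.1
x, P = predict(x, P, gyro=np.array([0.01, 0.0, 0.0]), Q=np.eye(2) * 1e-5, dt=0.01)
x, P = update(x, P, g_meas=np.array([0.0, -0.02, -0.999]),
              R_meas=np.diag([0.05, 0.05, 0.05]))

Because the innovation covariance S grows with R_meas, an observation reported with a large variance is automatically down-weighted, which reflects the abstract's point that the DNN's uncertainty output allows its inferences to be fused safely with the LiDAR-based estimate.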

Keywords