ROBOMECH Journal (Jun 2020)

Trajectory estimation and position correction for hopping robot navigation using monocular camera

  • Gabor Kovacs,
  • Yasuharu Kunii,
  • Takao Maeda,
  • Hideki Hashimoto

DOI
https://doi.org/10.1186/s40648-020-00172-3
Journal volume & issue
Vol. 7, no. 1
pp. 1 – 14

Abstract


In this paper, a navigation and environment mapping method is presented for small exploration robots that use hopping motion. While previous research on hopping rovers has mostly focused on mobility and mechanical design, the motivation for the proposed method is to provide a fully autonomous navigation system using only a monocular camera. The method accurately estimates the hopping distance and reconstructs the 3D environment using Structure from Motion, demonstrating that a monocular system is not only feasible but also accurate and robust. The relative scale ambiguity of the reconstructed scene and trajectory is resolved using the known gravitational acceleration and the constraints of parabolic motion. After each hop, the error in landing position is corrected by a modified Iterative Closest Point algorithm with non-overlapping part elimination. The environmental point cloud is projected onto a 2D image, which is used to find the most suitable landing position for the next hop via protrusion-based obstacle detection and to navigate the robot toward the goal direction. Both virtual environment simulations and real experiments confirm the feasibility and highlight the advantages of the presented method.
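The core idea behind resolving the monocular scale ambiguity can be illustrated with a small sketch (not the authors' implementation; the function name and synthetic data below are hypothetical). During free flight the hopper's vertical position follows z(t) = z0 + v0*t - 0.5*g*t^2, so fitting a quadratic to the SfM trajectory's vertical coordinate yields the apparent gravitational acceleration in SfM units; dividing the known metric g by that value gives the metres-per-SfM-unit scale:

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2], known a priori

def recover_metric_scale(t, z_sfm, g=G):
    """Recover the metric scale of an up-to-scale SfM trajectory
    from the parabolic-motion constraint.

    t     -- frame timestamps [s]
    z_sfm -- vertical positions in arbitrary SfM units
    Fitting z(t) = a*t^2 + b*t + c gives a = -0.5 * g_sfm,
    so the conversion factor is s = g / (-2*a)  [m per SfM unit].
    """
    a, _b, _c = np.polyfit(t, z_sfm, 2)
    return g / (-2.0 * a)

# Synthetic hop: 2 m/s vertical launch observed at 30 fps,
# divided by an unknown scale (0.25 m per unit) to mimic SfM output.
t = np.arange(0.0, 0.4, 1.0 / 30.0)
z_metric = 2.0 * t - 0.5 * G * t**2
z_sfm = z_metric / 0.25

print(recover_metric_scale(t, z_sfm))  # ≈ 0.25
```

With the scale factor recovered, every SfM point and camera pose can be multiplied by it to obtain a metric map and a metric hopping-distance estimate.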

Keywords