The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences (May 2022)

VISUAL ODOMETRY OF A MOBILE PALETTE ROBOT USING GROUND PLANE IMAGE FROM A FISHEYE CAMERA

  • U.-G. Lee,
  • S.-Y. Park

DOI
https://doi.org/10.5194/isprs-archives-XLIII-B1-2022-431-2022
Journal volume & issue
Vol. XLIII-B1-2022
pp. 431 – 436

Abstract

In this paper, we present a visual odometry method for a mobile robot based on feature tracking in a ground plane image generated from a fisheye image. To extract feature information on the ground, we use a fisheye camera, whose FOV is larger than that of a conventional pinhole camera, so that more of the ground plane can be captured. However, the large distortion of the fisheye image makes it difficult to extract visual features. The distortion can be removed, but doing so introduces other problems, such as reduced image resolution or loss of the fisheye camera's wide angle. We propose the EUCM-Cubemap projection model, which converts the fisheye image into a cubemap image without losing the FOV of the fisheye image. From the cubemap image, we then generate the Ground Plane Image, a virtual image that looks vertically down at the ground, as if captured by a virtual camera oriented perpendicular to the ground plane. In the ground plane image, the motion vector obtained by feature tracking between the previous and current frames is proportional to the robot's actual motion in the 2D ground plane. Thus, if the actual scale of the motion vector is known, we can estimate the mobile robot's velocity and steering angle on a virtual wheel defined in the ground plane image. The scale of the vector can be estimated from the position and focal length of the camera. Using these parameters, we estimate the mobile robot's pose by applying the bicycle kinematic model. Experimental results show that the proposed method can replace conventional odometry methods for mobile robots, and it is expected to be applicable to fields such as vision-based control and path planning.
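The last steps of the abstract (pixel-scale recovery from camera height and focal length, then pose integration with the bicycle kinematic model) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the camera height, focal length, frame rate, and virtual wheelbase values are assumptions, and a simple Euler discretization is used.

```python
import math

def pixel_to_metres(pixels, cam_height_m, focal_px):
    # Assumed scale model: for a virtual top-down camera at height h (m) with
    # focal length f (px), one pixel in the ground plane image spans h / f metres.
    return pixels * cam_height_m / focal_px

def bicycle_update(x, y, theta, v, delta, wheelbase, dt):
    """One Euler step of the standard bicycle kinematic model.

    x, y, theta : current pose (m, m, rad)
    v           : forward velocity of the virtual wheel (m/s)
    delta       : steering angle (rad)
    wheelbase   : distance between virtual axles (m) -- an assumed parameter
    dt          : time step (s)
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / wheelbase) * math.tan(delta) * dt
    return x, y, theta

# Hypothetical example: a 20 px motion vector per frame at 30 fps, camera
# 0.5 m above the ground, focal length 400 px, steering straight ahead.
v = pixel_to_metres(20, 0.5, 400) * 30           # 0.75 m/s
pose = bicycle_update(0.0, 0.0, 0.0, v, 0.0, 0.4, 1.0 / 30)
```

With zero steering angle the heading stays fixed and the robot simply advances `v * dt` along its current direction; a nonzero `delta` would rotate the heading at a rate of `(v / wheelbase) * tan(delta)`.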