International Journal of Advanced Robotic Systems (Feb 2022)

Robust monocular 3D object pose tracking for large visual range variation in robotic manipulation via scale-adaptive region-based method

  • Jiexin Zhou,
  • Zi Wang,
  • Yunna Bao,
  • Qiufu Wang,
  • Xiaoliang Sun,
  • Qifeng Yu

DOI: https://doi.org/10.1177/17298806221076978
Journal volume & issue: Vol. 19

Abstract


Many robot manipulation processes involve large visual range variation between the hand-eye camera and the object, which causes large-span changes in the object's scale across the image sequence captured by the camera. To accurately guide the manipulator, the relative six-degree-of-freedom (6D) pose between the object and the manipulator must be estimated continuously throughout the process. This large-span scale change often causes existing 6D pose tracking methods to fail. To tackle this problem, this article proposes a novel scale-adaptive region-based monocular pose tracking method. First, the impact of object scale on the convergence performance of the local region-based pose tracker is systematically tested and analyzed. Then, a universal region radius calculation model based on object scale is built from the statistical analysis results. Finally, we develop a novel scale-adaptive localized region-based pose tracking model by integrating the scale-adaptive radius selection mechanism into the local region-based method. The proposed method adjusts the local region size according to the scale of the object's projection and thereby achieves robust pose tracking. Experimental results on synthetic and real image sequences show that the proposed method outperforms the traditional localized region-based method in manipulator operation scenarios that involve large visual range variation.
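As a rough illustration of the idea described above (not the authors' actual model, whose form and parameters are not given in the abstract), the sketch below estimates the object's projected scale in pixels from its 3D model points under the current pose, then maps that scale to a local region radius. The linear mapping, the clamping bounds, and all helper names (projected_scale, local_region_radius, alpha, beta, r_min, r_max) are assumptions introduced purely for demonstration.

```python
# Illustrative sketch only: the paper's actual radius calculation model is not
# specified in the abstract; the linear mapping and its bounds are assumed here.
import numpy as np

def projected_scale(model_points, R, t, K):
    """Approximate the object's image-space scale (pixels) as the diagonal of
    the bounding box of the projected 3D model points."""
    cam_pts = R @ model_points.T + t.reshape(3, 1)   # 3 x N, camera frame
    uv = (K @ cam_pts)[:2] / cam_pts[2]              # 2 x N, pixel coordinates
    extent = uv.max(axis=1) - uv.min(axis=1)         # bounding-box width, height
    return float(np.hypot(*extent))                  # bounding-box diagonal

def local_region_radius(scale_px, alpha=0.04, beta=4.0, r_min=4, r_max=40):
    """Hypothetical scale-adaptive radius model: a linear function of the
    projected scale, clamped to a practical range of local region sizes."""
    return int(np.clip(alpha * scale_px + beta, r_min, r_max))

if __name__ == "__main__":
    # Toy example: a 10 cm cube placed 0.5 m in front of the camera.
    cube = 0.05 * np.array([[x, y, z] for x in (-1, 1)
                            for y in (-1, 1) for z in (-1, 1)])
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    R, t = np.eye(3), np.array([0.0, 0.0, 0.5])
    s = projected_scale(cube, R, t, K)
    print(f"projected scale: {s:.1f} px -> region radius: {local_region_radius(s)} px")
```

As the camera approaches or retreats from the object, the projected scale grows or shrinks, and the selected radius changes with it, which is the scale-adaptive behavior the abstract describes.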