Sensors (Feb 2023)

Robot Programming from a Single Demonstration for High Precision Industrial Insertion

  • Kaimeng Wang,
  • Yongxiang Fan,
  • Ichiro Sakuma

DOI
https://doi.org/10.3390/s23052514
Journal volume & issue
Vol. 23, no. 5
p. 2514

Abstract

We propose a novel approach for robotic industrial insertion tasks using the Programming by Demonstration technique. Our method allows robots to learn a high-precision task by observing a single human demonstration, without requiring any prior knowledge of the object. We introduce an Imitated-to-Finetuned approach that generates imitated approach trajectories by cloning the human hand's movements and then fine-tunes the goal position with a visual servoing approach. To identify object features for visual servoing, we model object tracking as a moving-object detection problem, separating each demonstration video frame into a moving foreground, which contains the object and the demonstrator's hand, and a static background. A hand keypoint estimation function is then used to remove the redundant features on the hand. Experiments show that the proposed method enables robots to learn precision industrial insertion tasks from a single human demonstration.
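A minimal sketch of the feature-selection step described in the abstract is given below. This is an illustration under stated assumptions, not the authors' implementation: it uses OpenCV background subtraction to obtain the moving-foreground mask and corner detection for candidate object features, and it assumes some hand keypoint estimator is available (hand_keypoints_px is a hypothetical placeholder).

import cv2
import numpy as np

def hand_keypoints_px(frame_bgr):
    """Stand-in for a hand keypoint estimator (e.g. a pose model).
    Should return an (N, 2) array of pixel coordinates of hand joints."""
    raise NotImplementedError  # hypothetical; plug in an estimator here

def object_features(video_path, min_hand_dist=25.0):
    """Per-frame object features: corners inside the moving foreground,
    with features close to the demonstrator's hand removed."""
    cap = cv2.VideoCapture(video_path)
    bg_sub = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
    per_frame_features = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Moving-foreground mask: pixels that differ from the learned background.
        fg_mask = bg_sub.apply(frame)
        fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

        # Corner features restricted to the moving foreground (object + hand).
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners = cv2.goodFeaturesToTrack(gray, maxCorners=200, qualityLevel=0.01,
                                          minDistance=7, mask=fg_mask)
        if corners is None:
            per_frame_features.append(np.empty((0, 2)))
            continue
        corners = corners.reshape(-1, 2)

        # Discard features that are likely on the demonstrator's hand.
        try:
            hand_pts = hand_keypoints_px(frame)
            dists = np.linalg.norm(corners[:, None, :] - hand_pts[None, :, :], axis=2)
            corners = corners[dists.min(axis=1) > min_hand_dist]
        except NotImplementedError:
            pass  # no hand estimator plugged in; keep all foreground features

        per_frame_features.append(corners)
    cap.release()
    return per_frame_features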

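The goal-position fine-tuning relies on visual servoing. The sketch below shows one update step of a standard image-based visual servoing (IBVS) law using the classic point-feature interaction matrix; the camera intrinsics (fx, fy, cx, cy), gain, and feature depths are assumed inputs rather than details taken from the paper.

import numpy as np

def interaction_matrix(x, y, z):
    """Image Jacobian of one point feature at normalized image
    coordinates (x, y) and depth z (Chaumette-Hutchinson form)."""
    return np.array([
        [-1 / z, 0, x / z, x * y, -(1 + x * x), y],
        [0, -1 / z, y / z, 1 + y * y, -x * y, -x],
    ])

def ibvs_step(current_px, goal_px, depths, fx, fy, cx, cy, gain=0.5):
    """One IBVS update: returns a 6-DoF camera velocity twist
    (vx, vy, vz, wx, wy, wz) that reduces the feature error.
    current_px, goal_px: (N, 2) pixel coordinates of matched features;
    depths: (N,) feature depths in metres."""
    current_px = np.asarray(current_px, dtype=float)
    goal_px = np.asarray(goal_px, dtype=float)
    # Convert pixels to normalized image coordinates.
    cur = (current_px - [cx, cy]) / [fx, fy]
    des = (goal_px - [cx, cy]) / [fx, fy]
    error = (cur - des).reshape(-1)
    L = np.vstack([interaction_matrix(x, y, z) for (x, y), z in zip(cur, depths)])
    # Classic law: v = -lambda * pinv(L) * e
    return -gain * np.linalg.pinv(L) @ error

Driving this twist through the robot controller until the feature error falls below a threshold is one common way to realize the final precise alignment; how the paper maps the correction to the end-effector is not specified in this listing.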
Keywords