IEEE Access (Jan 2019)

Learning Articulated Constraints From a One-Shot Demonstration for Robot Manipulation Planning

  • Yizhou Liu,
  • Fusheng Zha,
  • Lining Sun,
  • Jingxuan Li,
  • Mantian Li,
  • Xin Wang

DOI
https://doi.org/10.1109/ACCESS.2019.2953894
Journal volume & issue
Vol. 7
pp. 172584–172596

Abstract


Robots operating in domestic environments generally need to interact with articulated objects such as doors, drawers, laptops, and swivel chairs. The rigid bodies that make up these objects are connected by revolute or prismatic pairs. Robots are expected to learn and understand an object's articulated constraints through a simple interaction method; in this way, the autonomy of robot manipulation in environments with unstructured constraints can be greatly improved. In this paper, a method is proposed to obtain an articulated object's constraint model by learning from a one-shot continuous visual demonstration containing multistep movements, which enables a human teacher to demonstrate several tasks in succession without manual segmentation. Finally, a six-degree-of-freedom robot uses the constraint model obtained by demonstration learning to plan the manipulation of various tasks based on the AG-CBiRRT algorithm.

Keywords