Applied Sciences (May 2018)

Design of Demonstration-Driven Assembling Manipulator

  • Qianxiao Wei,
  • Canjun Yang,
  • Wu Fan,
  • Yibing Zhao

DOI: https://doi.org/10.3390/app8050797
Journal volume & issue: Vol. 8, No. 5, p. 797

Abstract

Currently, a mechanical arm or manipulator must be programmed by humans in advance to define its motion trajectory before practical use. However, such programming is tedious and costly, which prevents these manipulators from performing a variety of tasks easily and quickly. This article focuses on the design of a vision-guided manipulator that requires no explicit human programming. The proposed demonstration-driven system mainly consists of a manipulator, a control box, and a camera. Instead of programming the detailed motion trajectory, one only needs to show the system manually how to perform a given task. Using internal object-recognition and motion-detection algorithms, the camera captures information about the task to be performed and generates motion trajectories that allow the manipulator to copy the human demonstration. The movement of the manipulator's joints is computed by a trajectory planner in the control box. Experimental results show that the system can imitate humans easily, quickly, and accurately in common tasks such as sorting and assembling objects. Teaching the manipulator the desired motion helps eliminate the complexity of programming for motion control.
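The abstract describes a pipeline in which demonstrated poses are turned into joint trajectories by a planner in the control box. The paper does not give the planner's algorithm here; as a minimal illustrative sketch (not the authors' implementation), a joint-space planner can linearly interpolate between demonstrated waypoints. The function name `plan_trajectory` and its parameters are assumptions for illustration only.

```python
from typing import List

def plan_trajectory(waypoints: List[List[float]],
                    steps_per_segment: int = 10) -> List[List[float]]:
    """Generate intermediate joint configurations between successive
    demonstrated waypoints by linear interpolation (illustrative only)."""
    trajectory: List[List[float]] = []
    for start, end in zip(waypoints, waypoints[1:]):
        for i in range(steps_per_segment):
            t = i / steps_per_segment  # interpolation parameter in [0, 1)
            trajectory.append([a + t * (b - a) for a, b in zip(start, end)])
    trajectory.append(list(waypoints[-1]))  # include the final demonstrated pose
    return trajectory

# Example: a two-joint arm moving from [0, 0] rad to [1.0, 0.5] rad
path = plan_trajectory([[0.0, 0.0], [1.0, 0.5]], steps_per_segment=5)
```

In a real system, such piecewise-linear joint paths would typically be smoothed (e.g., with velocity and acceleration limits) before being sent to the motor controllers.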
