Machines (Oct 2023)

Grasping Pose Estimation for Robots Based on Convolutional Neural Networks

  • Tianjiao Zheng,
  • Chengzhi Wang,
  • Yanduo Wan,
  • Sikai Zhao,
  • Jie Zhao,
  • Debin Shan,
  • Yanhe Zhu

DOI: https://doi.org/10.3390/machines11100974
Journal volume & issue: Vol. 11, no. 10, p. 974

Abstract


Robots can gradually learn to plan grasping actions in unknown scenes by learning to manipulate typical scenes. Grasping pose estimation, as an end-to-end approach, has developed rapidly in recent years because of its good generalization. In this paper, we present a grasping pose estimation method for robots based on convolutional neural networks. In this method, a convolutional neural network model outputs the grasping success rate, approach angle, and gripper opening width for each input voxel. A grasping dataset was produced, and the model was trained in a physics simulator. A position optimization for robotic grasping, based on the distribution of the object centroid, was proposed to improve the grasping success rate. An experimental platform for robot grasping was established, and 11 common everyday objects were selected for the experiments. Grasping experiments were performed on each of the 11 objects individually, on multiple objects in clutter, and in a dark environment without illumination. The results show that the method can grasp objects of different geometries, including irregular shapes, and is not influenced by lighting conditions. The total grasping success rate was 88.2% for individual objects and 81.1% for the cluttered scene.
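The abstract does not describe the network itself, only its interface: a voxel grid in, and a per-voxel grasping success rate, approach angle, and gripper opening width out. The PyTorch sketch below is a minimal illustration of such an interface under the assumption of a small fully convolutional 3D CNN; the class name, layer sizes, and output parameterization are hypothetical and are not the authors' architecture.

```python
# Minimal sketch only: the abstract does not specify the network architecture,
# so the layer sizes, the fully convolutional 3D design, and all names here
# (GraspPoseNet, the three heads) are illustrative assumptions.
import torch
import torch.nn as nn

class GraspPoseNet(nn.Module):
    """Hypothetical 3D CNN mapping an input voxel grid to per-voxel outputs:
    grasp success rate, approach angle, and gripper opening width."""

    def __init__(self):
        super().__init__()
        # Shared 3D convolutional encoder over the voxel occupancy grid.
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv3d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        # One 1x1x1 convolutional head per predicted quantity.
        self.quality_head = nn.Conv3d(32, 1, kernel_size=1)  # grasp success rate
        self.angle_head = nn.Conv3d(32, 1, kernel_size=1)    # approach angle
        self.width_head = nn.Conv3d(32, 1, kernel_size=1)    # gripper opening width

    def forward(self, voxels):
        feat = self.encoder(voxels)
        quality = torch.sigmoid(self.quality_head(feat))  # success rate in [0, 1]
        angle = self.angle_head(feat)                      # regressed approach angle
        width = self.width_head(feat)                      # regressed opening width
        return quality, angle, width

# Example: a batch of one 40^3 occupancy grid (grid size is arbitrary here).
net = GraspPoseNet()
quality, angle, width = net(torch.zeros(1, 1, 40, 40, 40))
print(quality.shape, angle.shape, width.shape)  # each (1, 1, 40, 40, 40)
```

In such a per-voxel formulation, a grasp candidate would be read off at the voxel with the highest predicted success rate, which is consistent with the abstract's centroid-based position optimization being applied on top of the network output.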

Keywords