Revista Iberoamericana de Automática e Informática Industrial RIAI (Jan 2020)
Object classification using bimodal perception data extracted from single-touch robotic grasps
Abstract
This work presents a method to classify grasped objects with a multi-fingered robotic hand by combining proprioceptive and tactile data into a hybrid descriptor. The proprioceptive data are obtained from the joint positions of the hand, and the tactile data are obtained from the contacts registered by pressure cells installed on the phalanges. The proposed approach identifies the grasped object by learning its contact geometry and stiffness from the sensor readings. We show that combining bimodal data of different natures with supervised learning techniques improves the recognition rate. In the experiments, more than 3000 grasps of up to 7 different domestic objects were carried out, obtaining an average F1 score of around 95% from a single grasp. In addition, the generalization of the method was verified by training the system with certain objects and classifying new, similar ones without any prior knowledge.
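The hybrid-descriptor idea summarized above can be sketched in a few lines: one feature vector per grasp is formed by concatenating the joint positions with the pressure-cell readings, and a supervised classifier is trained on the labeled grasps. The snippet below is a minimal illustration only, assuming hypothetical array sizes, synthetic data, and a generic scikit-learn SVM; it is not the exact pipeline or classifier reported in the paper.

```python
# Minimal sketch (assumption): build a hybrid descriptor by concatenating
# proprioceptive (joint positions) and tactile (pressure cell) readings
# from a single grasp, then train a generic supervised classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

def hybrid_descriptor(joint_positions, pressure_readings):
    """Concatenate both sensing modalities into one feature vector."""
    return np.concatenate([np.asarray(joint_positions, dtype=float),
                           np.asarray(pressure_readings, dtype=float)])

# Hypothetical data: one descriptor per grasp and its object label.
rng = np.random.default_rng(0)
n_grasps, n_joints, n_cells, n_objects = 300, 16, 24, 7
X = np.vstack([hybrid_descriptor(rng.normal(size=n_joints),
                                 rng.normal(size=n_cells))
               for _ in range(n_grasps)])
y = rng.integers(0, n_objects, size=n_grasps)

# Train a supervised classifier and report a macro-averaged F1 score.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("F1 (macro):", f1_score(y_test, clf.predict(X_test), average="macro"))
```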
Keywords