Sensors (Aug 2023)

Depth-Dependent Control in Vision-Sensor Space for Reconfigurable Parallel Manipulators

  • Arturo Franco-López,
  • Mauro Maya,
  • Alejandro González,
  • Antonio Cardenas,
  • Davide Piovesan

DOI
https://doi.org/10.3390/s23167039
Journal volume & issue
Vol. 23, no. 16
p. 7039

Abstract

In this paper, a control approach for reconfigurable parallel robots is presented. Based on it, controllers in the vision-sensor, 3D, and joint spaces are designed and implemented for target-tracking tasks on a novel reconfigurable delta-type parallel robot; no a priori information about the target trajectory is required. Robot reconfiguration can overcome some of the limitations of parallel robots, such as a small relative workspace or multiple singularities, at the cost of increased manipulator complexity, which makes control design even more challenging; no general control methodology exists for reconfigurable parallel robots. Tracking objects with unknown trajectories is a demanding task required in many applications. Sensor-based robot control has been widely used for this type of task, but it cannot be straightforwardly extended to reconfigurable parallel manipulators. The proposed vision-sensor space control is inspired by, and can be seen as an extension of, the Velocity Linear Camera Model–Camera Space Manipulation (VLCM-CSM) methodology. Several experiments were carried out on a reconfigurable delta-type parallel robot. An average positioning error of 0.6 mm was obtained for static targets, and tracking errors of 2.5 mm, 3.9 mm, and 11.5 mm were obtained for targets moving along a linear trajectory at speeds of 6.5, 9.3, and 12.7 cm/s, respectively, with a control cycle time of 16 ms. These results validate the proposed approach and improve upon previous results for non-reconfigurable robots.
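
As a rough illustration of the kind of vision-sensor (camera-space) tracking loop the abstract describes, the Python sketch below fits a local linear camera model by least squares and computes a per-cycle joint step from the camera-space error. This is a minimal sketch under assumed interfaces, not the paper's implementation: the depth-dependent terms, the VLCM-CSM details, and the robot reconfiguration are not reproduced, and all data shapes are illustrative assumptions.

import numpy as np

def estimate_camera_model(dq_hist, dpx_hist):
    # Least-squares fit of a local linear model J (2 x n_joints) from
    # recent joint increments and the image-feature increments they caused,
    # so that J @ dq approximates dpx.
    DQ = np.asarray(dq_hist)    # (k, n_joints) joint increments
    DPX = np.asarray(dpx_hist)  # (k, 2) pixel increments of the tracked feature
    X, *_ = np.linalg.lstsq(DQ, DPX, rcond=None)  # solves DQ @ X ~= DPX
    return X.T                  # (2, n_joints)

def camera_space_step(J, px_now, px_target, gain=0.5):
    # One control cycle: joint increment that moves the tracked feature
    # toward the target in pixel space via the pseudo-inverse of J.
    err = px_target - px_now    # camera-space error (pixels)
    return gain * np.linalg.pinv(J) @ err

In CSM-style schemes the linear model is refit from recent motion samples as the robot moves, which is why the loop needs neither a calibrated camera model nor prior knowledge of the target trajectory.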

Keywords