Robotics (Oct 2023)

A Novel Actor–Critic Motor Reinforcement Learning for Continuum Soft Robots

  • Luis Pantoja-Garcia,
  • Vicente Parra-Vega,
  • Rodolfo Garcia-Rodriguez,
  • Carlos Ernesto Vázquez-García

DOI
https://doi.org/10.3390/robotics12050141
Journal volume & issue
Vol. 12, no. 5
p. 141

Abstract


Reinforcement learning (RL) is explored for motor control of a novel pneumatic-driven soft robot modeled as a continuum medium with varying density. This model admits closed-form Lagrangian dynamics that fulfill the fundamental structural property of passivity, among others. The question then arises of how to synthesize a passivity-based RL scheme that controls the unknown continuum soft-robot dynamics while exploiting its input–output energy properties through a reward-based neural-network controller. We therefore propose a continuous-time Actor–Critic scheme for tracking tasks of a continuum 3D soft robot subject to Lipschitz disturbances. A reward-based temporal difference drives learning through a novel discontinuous adaptive mechanism for the Critic neural weights, while the reward and the integral of the Bellman-error approximation reinforce the adaptive mechanism for the Actor neural weights. Closed-loop stability is guaranteed in the sense of Lyapunov, yielding local exponential convergence of the tracking errors based on integral sliding modes. Notably, the dynamics are assumed unknown, yet the control remains continuous and robust. A representative simulation study shows the effectiveness of the proposal for tracking tasks.
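To make the Actor–Critic idea in the abstract concrete, the sketch below shows a heavily simplified discrete-time analogue: a temporal-difference (Bellman) error updates Critic weights, and the same reinforcement signal adapts Actor weights that correct a tracking controller. Everything here is an assumption for illustration — the plant is a double integrator standing in for the unknown soft-robot dynamics, the features, gains, and PD baseline are invented, and the paper's continuous-time formulation, discontinuous Critic adaptation, and integral-sliding-mode term are not reproduced.

```python
import numpy as np

def desired(t):
    """Hypothetical reference trajectory (position, velocity) — an assumption."""
    return np.array([np.sin(0.5 * t), 0.5 * np.cos(0.5 * t)])

def features(e):
    """Simple polynomial features of the tracking error (illustrative choice)."""
    return np.array([e[0], e[1], e[0] * e[1], 1.0])

def actor_critic_tracking(T=2000, dt=0.01, alpha_c=0.05, alpha_a=0.02, gamma=0.99):
    rng = np.random.default_rng(0)
    w_c = np.zeros(4)          # Critic weights: V(e) ~ w_c . phi(e)
    w_a = np.zeros(4)          # Actor weights: learned correction u_rl ~ w_a . phi(e)
    x = np.array([0.5, 0.0])   # plant state: position, velocity
    for k in range(T):
        t = k * dt
        e = x - desired(t)
        phi = features(e)
        # Stabilising PD baseline plus a bounded learned correction
        # (the paper instead uses an integral-sliding-mode term; this is a stand-in).
        u = (-10.0 * e[0] - 5.0 * e[1]
             + np.clip(w_a @ phi, -1.0, 1.0)
             + 0.01 * rng.standard_normal())        # exploration noise
        # Double-integrator stand-in for the unknown soft-robot dynamics.
        x = np.array([x[0] + x[1] * dt, x[1] + u * dt])
        e_new = x - desired(t + dt)
        r = -(e @ e)                                # reward penalises tracking error
        # Temporal-difference (Bellman) error.
        delta = r + gamma * (w_c @ features(e_new)) - w_c @ phi
        w_c += alpha_c * delta * phi                # Critic update driven by TD error
        w_a += alpha_a * delta * phi                # Actor reinforced by the same signal
    return e_new, w_c, w_a
```

The key structural point carried over from the abstract is that a single reward-based TD signal drives both adaptive mechanisms; the stability and convergence guarantees of the paper rely on the Lyapunov analysis and sliding-mode construction, not on this toy loop.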

Keywords