IEEE Transactions on Neural Systems and Rehabilitation Engineering (Jan 2022)

Improving Automatic Control of Upper-Limb Prosthesis Wrists Using Gaze-Centered Eye Tracking and Deep Learning

  • Maxim Karrenbach,
  • David Boe,
  • Astrini Sie,
  • Rob Bennett,
  • Eric Rombokas

DOI
https://doi.org/10.1109/TNSRE.2022.3147772
Journal volume & issue
Vol. 30
pp. 340–349

Abstract

Many upper-limb prostheses lack proper wrist rotation functionality, forcing users to adopt poor compensatory strategies that can lead to overuse injuries or device abandonment. In this study, we investigate the validity of creating and implementing a data-driven predictive control strategy for object-grasping tasks performed in virtual reality. We propose using gaze-centered vision to predict a user's wrist rotations and conduct a user study to investigate the impact of this predictive control. We demonstrate that this vision-based predictive system reduces both compensatory shoulder movement and task completion time. We discuss the cases in which the virtual prosthesis with the predictive model did and did not yield a physical improvement across various arm movements. We also discuss the cognitive value of incorporating such predictive control strategies into prosthetic controllers. We find that gaze-centered vision provides information about user intent during object reaching and that prosthetic hand performance improves greatly when wrist prediction is implemented. Lastly, we address the limitations of this study, both with respect to the study itself and to future physical implementations.
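
The abstract does not detail the model itself; as a rough illustration of the idea of regressing wrist rotation from gaze-centered vision, the sketch below assumes a small convolutional network that maps a gaze-centered image crop to a single pronation/supination angle. The architecture, crop size, and names here are illustrative assumptions, not the authors' implementation.

    # Minimal sketch (illustrative, not the authors' model): a small CNN that
    # regresses a wrist pronation/supination angle from a gaze-centered crop.
    import torch
    import torch.nn as nn

    class GazeWristRegressor(nn.Module):
        """Maps a gaze-centered RGB crop to one wrist rotation angle (radians)."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(64, 1)  # predicted wrist angle

        def forward(self, x):
            z = self.features(x).flatten(1)
            return self.head(z)

    # Example training step on a dummy batch of 64x64 gaze-centered crops
    # (sizes and hyperparameters are assumptions for illustration only).
    model = GazeWristRegressor()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    crops = torch.randn(8, 3, 64, 64)      # gaze-centered image crops
    target_angles = torch.randn(8, 1)      # measured wrist rotations (radians)

    pred = model(crops)
    loss = loss_fn(pred, target_angles)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

In a closed-loop setting of the kind described in the abstract, such a predictor would be queried during reaching so that the predicted angle can drive the virtual prosthesis wrist automatically; the details of that control loop are not specified in this listing.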

Keywords