IEEE Access (Jan 2016)

Hybrid Control of a Vision-Guided Robot Arm by EOG, EMG, EEG Biosignals and Head Movement Acquired via a Consumer-Grade Wearable Device

  • Ludovico Minati,
  • Natsue Yoshimura,
  • Yasuharu Koike

DOI
https://doi.org/10.1109/ACCESS.2017.2647851
Journal volume & issue
Vol. 4
pp. 9528 – 9541

Abstract

Simultaneous acquisition of the electrooculogram, jaw electromyogram, electroencephalogram, and head movement via consumer-grade wearable devices has become possible. Such devices offer new opportunities to deploy practical biosignal-based interfaces for assistive robots; however, they also pose challenges related to the available signals and their characteristics. In this proof-of-concept study, we demonstrate successful control of a 5 + 1 degrees-of-freedom robot arm from a consumer wireless headband, through four control modes predicated on distinct signal combinations. We propose a control approach that is hybrid at two levels and seeks a compromise between robot controllability and keeping the user focused on the goal rather than the process. First, robot arm steering combines discrete and proportional aspects. Second, after the robot has been steered toward the approximate target direction, a sparse approach is followed: the user only needs to issue a single command, after which steering adjustment and grasping are performed automatically under stereoscopic vision guidance. We present in detail the associated algorithms, whose implementation is publicly available. Within this framework, we also demonstrate control of arm posture and grasping force based, respectively, on object visual features and user input. We regard the proposed interface as a viable blueprint for future work on controlling wheelchair-mounted and meal-assisting robot arms.
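To make the "discrete and proportional aspects" of the steering concrete, the following is a minimal illustrative sketch, not the authors' published implementation: it assumes a discrete biosignal event (e.g. a jaw-EMG clench) selects which joint is active, while a continuous head-tilt reading drives that joint proportionally. All function names, the dead-zone, and the gain are assumptions made for illustration.

```python
# Hypothetical hybrid discrete/proportional steering rule (illustration only).
DEAD_ZONE_DEG = 5.0   # assumed: ignore small head tilts
GAIN = 0.02           # assumed: radians of joint motion per degree of tilt

def steer(joint_angles, active_joint, head_tilt_deg, clench_event):
    """Return updated (joint_angles, active_joint).

    clench_event  -- True when a discrete EMG command is detected;
                     cycles which joint the proportional channel drives.
    head_tilt_deg -- continuous head-tilt reading (e.g. from a headband IMU).
    """
    if clench_event:
        # Discrete aspect: a single command switches the controlled joint.
        active_joint = (active_joint + 1) % len(joint_angles)
    if abs(head_tilt_deg) > DEAD_ZONE_DEG:
        # Proportional aspect: joint velocity scales with tilt beyond the dead zone.
        sign = 1.0 if head_tilt_deg > 0 else -1.0
        delta = GAIN * (head_tilt_deg - sign * DEAD_ZONE_DEG)
        joint_angles = list(joint_angles)
        joint_angles[active_joint] += delta
    return joint_angles, active_joint
```

Under this sketch, sustained head tilt produces smooth motion of one joint, while brief discrete events reconfigure what that motion means, which is one plausible reading of the two-level hybrid described above.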

Keywords