Frontiers in Computational Neuroscience (Feb 2015)

Decoding of human hand actions to handle missing limbs in Neuroprosthetics

  • Jovana Belić,
  • Aldo A. Faisal

Journal volume & issue
Vol. 9



The only way we can interact with the world is through movement, and our primary interactions are via the hands; thus any loss of hand function has an immediate impact on our quality of life. However, to date it has not been systematically assessed how coordination of the hand's joints affects everyday actions. This is important for two fundamental reasons: first, to understand the representations and computations underlying motor control in in-the-wild situations, and second, to develop smarter controllers for prosthetic hands that have the same functionality as natural limbs. In this work we exploit the correlation structure of hand and finger movements in daily life. The novelty of our approach is that, instead of averaging variability out, we take the view that the structure of variability may contain valuable information about the task being performed. We asked seven subjects to interact in 17 daily-life situations and quantified behaviour in a principled manner using CyberGlove body sensor networks that, after accurate calibration, track all major joints of the hand. Our key findings are:

1. We confirmed that hand control in daily-life tasks is very low-dimensional: four to five dimensions are sufficient to explain 80-90% of the variability in the natural movement data.
2. We established a universally applicable measure of manipulative complexity that allowed us to measure and compare limb movements across tasks. We used Bayesian latent variable models to model the low-dimensional structure of finger joint angles in natural actions.
3. This allowed us to build a naïve classifier that, within the first 1000 ms of action initiation (from a flat-hand start configuration), predicted which of the 17 actions was going to be executed, enabling us to reliably predict action intention from very short-time-scale initial data and further revealing the predictable nature of hand movements for the control of neuroprosthetics and for teleoperation.
4. Using the Exp
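The low-dimensionality finding (finding 1) can be illustrated with a small sketch. The study used Bayesian latent variable models on CyberGlove recordings; the sketch below instead uses plain PCA via SVD on synthetic joint-angle data generated from a few latent "synergies", which is only an assumed stand-in for the real recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for glove recordings: 500 time samples of 20 joint
# angles generated from 4 latent "synergies" plus small sensor noise.
n_samples, n_joints, n_latent = 500, 20, 4
latents = rng.normal(size=(n_samples, n_latent))
mixing = rng.normal(size=(n_latent, n_joints))
angles = latents @ mixing + 0.1 * rng.normal(size=(n_samples, n_joints))

# PCA via SVD of the mean-centred data matrix.
centred = angles - angles.mean(axis=0)
singular_values = np.linalg.svd(centred, compute_uv=False)
variance = singular_values ** 2
explained = np.cumsum(variance) / variance.sum()

# Smallest number of principal components explaining >= 80% of the variance.
k = int(np.searchsorted(explained, 0.80) + 1)
print(k, explained[:5].round(3))
```

Because the synthetic data have four true latent dimensions, the cumulative explained variance saturates after at most four components, mirroring the paper's observation that a handful of dimensions captures most of natural hand movement.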
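The early-prediction idea (finding 3) can likewise be sketched: classify a movement from only its first few samples, the analogue of the first 1000 ms of an action. This is a nearest-centroid toy on synthetic trajectories, not the naïve classifier built on the study's latent variable models, and the accuracy reported is in-sample:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: each of 3 "actions" has a distinct mean trajectory over
# 10 joints; trials are noisy instances of those prototypes.
n_actions, n_trials, n_steps, n_joints, early = 3, 30, 50, 10, 5
prototypes = rng.normal(size=(n_actions, n_steps, n_joints))
trials = np.repeat(prototypes, n_trials, axis=0) + 0.3 * rng.normal(
    size=(n_actions * n_trials, n_steps, n_joints))
labels = np.repeat(np.arange(n_actions), n_trials)

# Features: only the early window of each trial, flattened.
feats = trials[:, :early, :].reshape(len(trials), -1)
centroids = np.stack(
    [feats[labels == a].mean(axis=0) for a in range(n_actions)])

# Predict each trial's action by nearest centroid (Euclidean distance).
dists = np.linalg.norm(feats[:, None, :] - centroids[None, :, :], axis=2)
pred = dists.argmin(axis=1)
accuracy = (pred == labels).mean()
print(round(float(accuracy), 3))
```

Even with the trajectory truncated to a short initial window, the between-action differences dominate the trial-to-trial noise, so the action identity is recoverable early, which is the intuition behind using such predictions for neuroprosthetic and teleoperation control.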