i-Perception (May 2012)

Motor Training: Comparison of Visual and Auditory Coded Proprioceptive Cues

  • Philip Jepson,
  • Adar Pelah

DOI: https://doi.org/10.1068/id249
Journal volume & issue: Vol. 3

Abstract


Self-perception of body posture and movement is achieved through multi-sensory integration, particularly the utilisation of vision and of proprioceptive information derived from muscles and joints. Disruption to these processes can occur following a neurological accident, such as stroke, leading to sensory and physical impairment. Rehabilitation can be aided by augmented visual and auditory biofeedback that stimulates neuroplasticity, but the effective design and application of feedback, particularly in the auditory domain, is non-trivial. Simple auditory feedback was tested by comparing the stepping accuracy of normal subjects given a visual spatial target (step length) with that of subjects given an auditory temporal target (step duration). A baseline measurement of step length and duration was taken using optical motion capture. Subjects (n = 20) took 20 ‘training’ steps (baseline ±25%) using either an auditory target (950 Hz tone, bell-shaped gain envelope) or a visual target (spot marked on the floor), and were then asked to replicate the target step (length or duration, corresponding to training) with all feedback removed. Mean percentage error was 11.5% (SD 7.0%) for visual cues and 12.9% (SD 11.8%) for auditory cues. Visual cues elicited a high degree of accuracy both in training and in the follow-up un-cued task; despite the novelty of the auditory cues, subjects’ mean accuracy approached that for visual cues, and initial results suggest that a limited amount of practice with auditory cues can improve performance.
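To make the reported error metric concrete, the following is a minimal sketch of how a mean percentage error such as the abstract's 11.5% (visual) and 12.9% (auditory) figures might be computed from replicated steps and their targets. This is an illustration only, not the authors' analysis code, and the measurement values shown are hypothetical.

```python
def mean_percentage_error(measured, targets):
    """Mean absolute deviation of each measurement from its target,
    expressed as a percentage of the target value."""
    errors = [abs(m - t) / t * 100.0 for m, t in zip(measured, targets)]
    return sum(errors) / len(errors)

# Hypothetical example: replicated step lengths (m) vs. target step lengths (m)
measured = [0.62, 0.55, 0.71]
targets = [0.60, 0.60, 0.65]
print(round(mean_percentage_error(measured, targets), 1))  # → 7.0
```

The same function applies unchanged to step durations (in seconds) for the auditory temporal condition, since the metric is dimensionless.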