i-Perception (May 2012)

Sensory Feedback and Sensorimotor Adaptation in Human-Computer Interface for a Gesture-Based Contactless Musical Instrument

  • Adar Pelah,
  • Philip Greenhalgh

DOI: https://doi.org/10.1068/id253
Journal volume & issue: Vol. 3

Abstract

A study is presented of a human-computer interface (HCI) for an expressive contactless musical instrument (using a time-of-flight (ToF) depth camera) that considers sensory feedback and sensorimotor adaptation in comparison with a conventional contact instrument. The design uses an intuitive ‘drum membrane’ paradigm for striking musical notes with simple pressing hand gestures on a notional keyboard in free air. In Experiment 1, 5 subjects were asked to complete a range of musical tasks using two forms of sensory feedback: Auditory-Only, in which subjects could only hear the consequences of their pressing gestures, and Visual+Auditory, in which subjects could both hear the sounds and receive visual feedback on a computer display. Results showed that Visual+Auditory feedback produced more precise performance (SD = 0.894) than Auditory-Only feedback (SD = 3.507), supporting the importance of visual feedback as an aid to natural gesture-based control in HCI. In Experiment 2, sensorimotor adaptation was compared between the contactless instrument (Visual-Only) and a conventional contact (Visual+Haptic) keyboard instrument. For each instrument, 7 subjects were asked to maintain tones at a perceived constant level (baseline) whilst a parameter (gain) was altered and then later restored. Once restored, the number of presses required to return to baseline was quantified as the after-effect of the adaptation. Results indicated that, while design requirements for a contactless instrument may be very different from those for one that includes physical contact, similar neural mechanisms mediate a user's dynamic adaptation to both types of instrument.
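The ‘drum membrane’ paradigm described above can be illustrated with a minimal sketch (not the authors' implementation): a note is triggered when the tracked hand's depth, as reported by a ToF camera, crosses a notional membrane plane, with a hysteresis band so that a single press produces a single note. The threshold values and the `detect_presses` function are hypothetical, assumed only for illustration.

```python
# Hypothetical thresholds (metres from the camera); real values would be
# calibrated to the user's reach and the camera's placement.
PRESS_DEPTH = 0.50    # notional 'membrane' plane: crossing it strikes a note
RELEASE_DEPTH = 0.45  # hand must retract past this before a new press counts

def detect_presses(depth_samples):
    """Return the sample indices at which a press gesture triggers a note.

    depth_samples: sequence of hand-depth readings (one per camera frame).
    Hysteresis between PRESS_DEPTH and RELEASE_DEPTH prevents a single
    press from retriggering on every frame the hand stays past the plane.
    """
    presses = []
    pressed = False
    for i, z in enumerate(depth_samples):
        if not pressed and z >= PRESS_DEPTH:
            presses.append(i)   # hand crossed the membrane: strike a note
            pressed = True
        elif pressed and z <= RELEASE_DEPTH:
            pressed = False     # hand retracted: re-arm for the next press
    return presses
```

For example, the stream `[0.40, 0.52, 0.55, 0.44, 0.53]` yields presses at indices 1 and 4: the sustained crossing at index 2 does not retrigger, and the retraction at index 3 re-arms the detector.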