IEEE Transactions on Neural Systems and Rehabilitation Engineering (Jan 2022)

StereoPilot: A Wearable Target Location System for Blind and Visually Impaired Using Spatial Audio Rendering

  • Xuhui Hu,
  • Aiguo Song,
  • Zhikai Wei,
  • Hong Zeng

DOI
https://doi.org/10.1109/TNSRE.2022.3182661
Journal volume & issue
Vol. 30
pp. 1621 – 1630

Abstract


Vision loss severely impairs object recognition and spatial cognition, and compensating for it through other sensory modalities, such as touch or hearing, remains a challenge. This paper introduces StereoPilot, a wearable target location system that facilitates the spatial cognition of blind and visually impaired (BVI) individuals. A head-mounted RGB-D camera measures the 3D spatial information of the environment, which is then processed into navigation cues. Leveraging spatial audio rendering (SAR) technology, the navigation cues are transmitted as 3D sound whose orientation can be distinguished by the human instinct for sound localization. Three haptic and auditory display strategies were compared with SAR in experiments with three BVI and four sighted subjects. Compared with mainstream speech instructional feedback, the results of the Fitts' law test showed that SAR triples the information transfer rate (ITR) for spatial navigation while reducing the positioning error by 40%. Furthermore, SAR exhibits a smaller learning effect than other sonification approaches such as vOICe. In desktop manipulation experiments, StereoPilot obtained precise localization of desktop objects while halving the completion time of target grasping tasks compared to the voice instruction method. In summary, StereoPilot provides an innovative wearable target location solution that swiftly and intuitively conveys environmental information to BVI individuals in the real world.
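The abstract reports ITR from a Fitts' law test. The paper's exact formulation is not given here; as a sketch, assuming the standard Shannon formulation of Fitts' law, the index of difficulty and throughput (ITR) for a pointing trial can be computed as follows (the distance, width, and time values below are illustrative, not from the paper):

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of the Fitts' law index of difficulty, in bits:
    ID = log2(D / W + 1), for target distance D and effective width W."""
    return math.log2(distance / width + 1)

def throughput(distance, width, movement_time):
    """Throughput (information transfer rate) in bits/s:
    the index of difficulty divided by the movement time."""
    return index_of_difficulty(distance, width) / movement_time

# Illustrative trial: a target 0.7 m away with 0.1 m effective width,
# reached in 2.0 s.
id_bits = index_of_difficulty(0.7, 0.1)   # log2(8) = 3.0 bits
itr = throughput(0.7, 0.1, 2.0)           # 3.0 / 2.0 = 1.5 bits/s
```

Under this formulation, "SAR triples the ITR" means that for trials of comparable difficulty, the movement time with spatial audio feedback is roughly a third of that with speech instruction.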

Keywords