IEEE Access (Jan 2020)

Sight-to-Sound Human-Machine Interface for Guiding and Navigating Visually Impaired People

  • Guojun Yang,
  • Jafar Saniie

DOI
https://doi.org/10.1109/ACCESS.2020.3029426
Journal volume & issue
Vol. 8
pp. 185416–185428

Abstract


Visually impaired people often find it hard to navigate efficiently in complex environments, and helping them navigate intuitively is not a trivial task. In sighted people, cognitive maps derived from visual cues play a pivotal role in navigation. In this paper, we present a sight-to-sound human-machine interface (STS-HMI), a novel machine vision guidance system that enables visually impaired people to navigate with instantaneous and intuitive responses. The proposed STS-HMI system extracts visual context from scenes and converts it into binaural acoustic cues that users can use to establish cognitive maps. A series of experiments was conducted to evaluate the performance of the STS-HMI system in a complex environment with difficult navigation paths. The experimental results confirm that the STS-HMI system improves visually impaired people's mobility with minimal effort.
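The abstract describes mapping visual context to binaural acoustic cues. As a minimal sketch of that general idea (not the authors' actual method), the Python snippet below renders a short stereo tone whose interaural time and level differences suggest a direction; the function name, head-radius constant, and panning scheme are illustrative assumptions, not details from the paper.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, rough average human head radius (assumed)
SAMPLE_RATE = 44_100     # Hz

def binaural_cue(azimuth_deg, duration=0.2, freq=880.0):
    """Render a stereo tone whose interaural time and level differences
    suggest a source at the given azimuth (0 = ahead, +90 = hard right).
    Hypothetical helper, not part of the STS-HMI system."""
    az = np.radians(azimuth_deg)
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    tone = np.sin(2 * np.pi * freq * t)

    # Interaural time difference (Woodworth's spherical-head approximation).
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (az + np.sin(az))
    shift = int(abs(itd) * SAMPLE_RATE)

    # Interaural level difference via a simple constant-power pan.
    right_gain = np.sqrt(0.5 * (1 + np.sin(az)))
    left_gain = np.sqrt(0.5 * (1 - np.sin(az)))

    left = left_gain * tone
    right = right_gain * tone
    # Delay the ear farther from the source.
    if itd > 0:    # source to the right: delay the left ear
        left = np.concatenate([np.zeros(shift), left])[: len(tone)]
    elif itd < 0:  # source to the left: delay the right ear
        right = np.concatenate([np.zeros(shift), right])[: len(tone)]
    return np.stack([left, right], axis=1)

# Example: cue for a waypoint detected 30 degrees to the user's right.
cue = binaural_cue(30.0)
```

Played over headphones, such a cue lets the listener localize the indicated direction without any visual input, which is the intuition behind converting visual context into binaural sound.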

Keywords