Information (Jul 2022)

A Multi-Sensory Guidance System for the Visually Impaired Using YOLO and ORB-SLAM

  • Zaipeng Xie,
  • Zhaobin Li,
  • Yida Zhang,
  • Jianan Zhang,
  • Fangming Liu,
  • Wei Chen

DOI
https://doi.org/10.3390/info13070343
Journal volume & issue
Vol. 13, no. 7
p. 343

Abstract

Guidance systems for visually impaired persons have become a popular research topic in recent years. Existing guidance systems on the market typically rely on auxiliary tools and methods such as GPS, UWB, or a simple white cane, engaging only a single tactile or auditory sense. These approaches can be inadequate in complex indoor environments. This paper proposes a multi-sensory guidance system for the visually impaired that provides both tactile and auditory advice using ORB-SLAM and YOLO techniques. Based on an RGB-D camera, local obstacle avoidance is realized at the tactile level through point cloud filtering, which alerts the user via a vibrating motor. The proposed method generates a dense navigation map through coordinate transformation to implement global obstacle avoidance and path planning for the user. Real-time target detection and a YOLO-based voice-prompt system are incorporated at the auditory level. We implemented the proposed system as a smart cane and evaluated it in four different test scenarios. Experimental results demonstrate that obstacles in the walking path can be reliably located and classified in real time. By integrating YOLO with ORB-SLAM, the proposed system can serve as a capable auxiliary that helps visually impaired people navigate safely.
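The tactile-level obstacle alert described in the abstract (point cloud filtering from an RGB-D camera driving a vibration motor) can be illustrated with a minimal sketch. The paper does not publish its filtering code, so the function below is a hypothetical pass-through filter: it keeps only points inside a forward walking corridor and reports an obstacle when enough points survive. All names, thresholds, and the camera-frame convention (z forward, y up, metres) are assumptions for illustration, not the authors' implementation.

```python
def detect_near_obstacle(points,
                         max_range=1.0,     # metres ahead to consider (assumed)
                         min_height=0.1,    # ignore floor points (assumed)
                         max_height=1.8,    # ignore ceiling points (assumed)
                         min_points=50):    # noise threshold (assumed)
    """Return True if the filtered point cloud suggests an obstacle ahead.

    points: iterable of (x, y, z) camera-frame coordinates in metres,
    assuming z points forward and y points up.
    """
    # Pass-through filter: count points inside the walking corridor.
    hits = sum(1 for (x, y, z) in points
               if 0.0 < z < max_range and min_height < y < max_height)
    return hits >= min_points

# In a real system this boolean would gate the vibration motor,
# e.g. pulsing faster as the obstacle cluster gets closer.
```

A production pipeline would additionally downsample (e.g. voxel-grid filtering) and remove statistical outliers before counting, but the corridor test captures the core idea of turning a depth cloud into a single tactile cue.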

Keywords