Applied Sciences (May 2024)

Indoor AR Navigation Framework Based on Geofencing and Image-Tracking with Accumulated Error Correction

  • Min Lu,
  • Masatoshi Arikawa,
  • Kohei Oba,
  • Keiichi Ishikawa,
  • Yuhan Jin,
  • Tomihiro Utsumi,
  • Ryo Sato

DOI: https://doi.org/10.3390/app14104262
Journal volume & issue: Vol. 14, no. 10, p. 4262

Abstract

This study presents a novel framework for improving indoor augmented reality (AR) navigation on modern smartphones by addressing two major challenges: managing large absolute coordinate spaces and reducing error accumulation in camera-based spatial tracking. Our contribution is twofold. First, we integrate geofencing with indoor navigation, accounting for spatial tracking errors, the timing of audio guidance, and dynamic 3D arrow visualization, to achieve effective local-to-global spatial coordinate transformation. This method provides precise local positioning and integrates seamlessly with larger spatial contexts, overcoming the limitations of current AR systems. Second, we introduce a periodic image-based calibration approach that minimizes the error accumulation inherent in camera-based tracking, improving accuracy over longer distances. Unlike prior studies that focus on individual technologies, our work explores the software architecture of indoor AR navigation, providing a comprehensive framework for its design and practical use. The practicality of our approach is validated through the implementation of a smartphone application at the Mineral Industry Museum of Akita University, which highlights the limitations of the component technologies and demonstrates our framework's effectiveness.
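
To illustrate the local-to-global coordinate transformation and the image-based correction described above, the following is a minimal sketch, not code from the paper: when a known image marker is recognized, its known global pose is used to rebuild a 2D rigid transform from the AR session's local tracking frame to the building's floor-plan frame, which also resets accumulated drift. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def make_local_to_global(anchor_local_xy, anchor_global_xy, heading_offset_rad):
    """Build a 2D rigid transform mapping local AR tracking coordinates to the
    global floor-plan frame, anchored at a recognized image marker whose
    global position and orientation are known (illustrative sketch only)."""
    c, s = np.cos(heading_offset_rad), np.sin(heading_offset_rad)
    R = np.array([[c, -s], [s, c]])                      # local-to-global rotation
    t = np.asarray(anchor_global_xy) - R @ np.asarray(anchor_local_xy)  # translation
    return lambda local_xy: R @ np.asarray(local_xy) + t

# Example: a marker detected at local (1.2, 0.5) m is known to sit at global
# (34.0, 12.0) m, with the local frame rotated 30 degrees relative to the plan.
# Re-running this at each recognized marker acts as the periodic calibration
# that discards drift accumulated since the previous marker.
to_global = make_local_to_global((1.2, 0.5), (34.0, 12.0), np.deg2rad(30.0))
print(to_global((2.0, 1.0)))   # current local position expressed in global coordinates
```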

Keywords