Sensors (Sep 2019)
A Model to Support Fluid Transitions between Environments for Mobile Augmented Reality Applications
Abstract
Adaptability across different environments remains a challenge for Mobile Augmented Reality (MAR). If not handled seamlessly, transitions between environments may cause discontinuities in navigation, disorienting users and undermining the acceptance of this technology. Such transitions are hard because no current localization technique works well everywhere: sensor-based applications can be hampered by obstacles that block sensor communication (e.g., GPS) and by infrastructure limitations (e.g., Wi-Fi), while image-based applications can be affected by lighting conditions that impair computer vision techniques. Hence, this paper presents an adaptive model for performing transitions between different types of environments in MAR applications. The model takes a hybrid approach, choosing the best combination of long-range sensors, short-range sensors, and computer vision techniques to perform fluid transitions between environments, mitigating problems with localization, orientation, and registration. To assess the model, we developed a MAR application and conducted a navigation test with volunteers to validate transitions between outdoor and indoor environments, followed by a short interview. The results show that the transitions were successful, since the application adapted itself to the studied environments, seamlessly switching sensors when needed.
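To make the hybrid selection idea concrete, the sketch below shows one possible way such a switching policy could be structured in a mobile application. It is only an illustrative assumption, not the paper's implementation: the `EnvironmentState` fields, the numeric thresholds, and the `chooseSource` function are all hypothetical names and values chosen for the example.

```kotlin
// Hypothetical sketch of a localization-source switching policy, assuming the
// app can observe three kinds of signals mentioned in the abstract: a long-range
// sensor (GPS), short-range sensors (Wi-Fi/BLE beacons), and a vision tracker.

enum class LocalizationSource { GPS, SHORT_RANGE, VISION, NONE }

data class EnvironmentState(
    val gpsAccuracyMeters: Double?,   // null when no GPS fix is available
    val shortRangeBeaconsSeen: Int,   // Wi-Fi APs / BLE beacons currently visible
    val trackedVisualFeatures: Int    // features the vision tracker reports
)

// Thresholds are illustrative assumptions, not values from the paper.
fun chooseSource(state: EnvironmentState): LocalizationSource = when {
    state.gpsAccuracyMeters != null && state.gpsAccuracyMeters <= 10.0 -> LocalizationSource.GPS
    state.trackedVisualFeatures >= 50 -> LocalizationSource.VISION
    state.shortRangeBeaconsSeen >= 3 -> LocalizationSource.SHORT_RANGE
    else -> LocalizationSource.NONE   // hold the last known pose until a source recovers
}

fun main() {
    // Outdoors: good GPS fix, no indoor beacons.
    println(chooseSource(EnvironmentState(5.0, 0, 20)))    // GPS
    // Entering a building: GPS degrades, visual features are plentiful.
    println(chooseSource(EnvironmentState(40.0, 2, 120)))  // VISION
    // Dim corridor: vision struggles, but beacons are visible.
    println(chooseSource(EnvironmentState(null, 4, 10)))   // SHORT_RANGE
}
```

A policy of this shape re-evaluates the available signals as the user moves, which is one way the "seamless switching" described in the abstract could be realized; the paper's actual model may combine sources rather than pick a single one.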
Keywords