Virtual Reality & Intelligent Hardware (Aug 2019)

Survey and evaluation of monocular visual-inertial SLAM algorithms for augmented reality

  • Li Jinyu,
  • Yang Bangbang,
  • Chen Danpeng,
  • Wang Nan,
  • Zhang Guofeng,
  • Bao Hujun

Journal volume & issue
Vol. 1, no. 4
pp. 386 – 410

Abstract

Although VSLAM/VISLAM has achieved great success, it remains difficult to quantitatively evaluate the localization results of different kinds of SLAM systems from the perspective of augmented reality, due to the lack of an appropriate benchmark. In practical AR applications, a variety of challenging situations (e.g., fast motion, strong rotation, severe motion blur, dynamic interference) may easily be encountered, since a home user may not move the AR device carefully and the real environment may be quite complex. In addition, for a good AR experience, the frequency of camera tracking loss should be minimized, and recovery from the failure state should be fast and accurate. Existing SLAM datasets/benchmarks generally only evaluate pose accuracy, and their camera motions are relatively simple and do not fit the common cases in mobile AR applications well. Motivated by the above, we build a new visual-inertial dataset together with a series of evaluation criteria for AR. We also review the existing monocular VSLAM/VISLAM approaches with detailed analyses and comparisons. In particular, we select eight representative monocular VSLAM/VISLAM approaches/systems and quantitatively evaluate them on our benchmark. Our dataset, sample code, and the corresponding evaluation tools are available at the benchmark website http://www.zjucvg.net/eval-vislam/.

Keywords: Visual-inertial SLAM; Odometry; Tracking; Localization; Mapping; Augmented reality