IEEE Access (Jan 2019)

Tightly-Coupled Monocular Visual-Odometric SLAM Using Wheels and a MEMS Gyroscope

  • Meixiang Quan,
  • Songhao Piao,
  • Minglang Tan,
  • Shi-Sheng Huang

DOI: https://doi.org/10.1109/ACCESS.2019.2930201
Journal volume & issue: Vol. 7, pp. 97374–97389

Abstract

In this paper, we present a novel tightly coupled probabilistic monocular visual-odometric simultaneous localization and mapping (VOSLAM) algorithm using wheels and a MEMS gyroscope, which provides accurate, robust, and long-term localization for ground robots. First, we present a novel on-manifold odometer preintegration theory: it integrates the wheel encoder and gyroscope measurements into a relative motion constraint that is independent of the linearization point, and it carefully addresses uncertainty propagation and gyroscope bias correction. Based on the preintegrated odometer measurement model, we introduce the odometer error term and tightly integrate it into the visual optimization framework. Then, to bootstrap the VOSLAM system, we propose a simple map initialization method. Finally, we present a complete localization mechanism that maximally exploits both sensing cues, providing different strategies for motion tracking when: 1) both measurements are available; 2) visual measurements are not available; and 3) the wheel encoders experience slippage, thereby ensuring accurate and robust motion tracking. The proposed algorithm is evaluated in extensive experiments, and the results demonstrate the superiority of the proposed system.
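To make the odometer preintegration idea concrete, the following is a minimal planar sketch (not the paper's method): it accumulates wheel-encoder forward velocities and gyroscope yaw rates into a single relative motion (dx, dy, dtheta). The function name and midpoint-integration scheme are illustrative assumptions; the paper's on-manifold formulation, covariance propagation, and gyroscope bias correction are omitted.

```python
import math

def preintegrate_odometry(encoder_v, gyro_wz, dt):
    """Accumulate wheel forward velocities (m/s) and gyro yaw rates (rad/s),
    each sampled at interval dt, into one relative planar motion.

    Illustrative sketch only: each incremental translation is the
    wheel-measured displacement rotated by the yaw integrated so far,
    using a midpoint yaw for better accuracy on curved paths.
    """
    dx = dy = dtheta = 0.0
    for v, wz in zip(encoder_v, gyro_wz):
        theta_mid = dtheta + 0.5 * wz * dt  # midpoint yaw for this step
        dx += v * dt * math.cos(theta_mid)
        dy += v * dt * math.sin(theta_mid)
        dtheta += wz * dt
    return dx, dy, dtheta

# Example: drive at 1 m/s while yawing at pi/2 rad/s for 1 s,
# i.e. a quarter circle of radius 2/pi.
n = 1000
dt = 1.0 / n
dx, dy, dth = preintegrate_odometry([1.0] * n, [math.pi / 2] * n, dt)
```

The resulting (dx, dy, dtheta) plays the role of the relative motion constraint in the abstract: it summarizes many raw measurements between two poses, so the optimizer can consume one constraint instead of re-integrating the raw data at every iteration.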

Keywords