Navigation (Jul 2023)
3D Vision Aided GNSS Real-Time Kinematic Positioning for Autonomous Systems in Urban Canyons
Abstract
In this paper, a three-dimensional vision-aided method is proposed to improve global navigation satellite system (GNSS) real-time kinematic (RTK) positioning. To mitigate the impact of reflected, non-line-of-sight (NLOS) reception, a sky-pointing camera with a deep neural network was employed to detect and exclude NLOS measurements. However, NLOS exclusion distorts the satellite geometry. To compensate for this weakened geometry, the complementarity between low-lying visual landmarks and the healthy, high-elevation satellite measurements was exploited to strengthen the geometric constraints. Specifically, inertial measurements, visual landmarks captured by a forward-looking camera, and the healthy GNSS measurements were tightly integrated via sliding-window optimization to estimate the GNSS-RTK float solution. The integer ambiguities were then resolved to obtain the fixed GNSS-RTK solution. The effectiveness of the proposed method was verified using several challenging data sets collected in urban canyons in Hong Kong.
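As a point of reference only (the notation below is illustrative and not taken from the paper), a tightly coupled visual-inertial-GNSS sliding-window estimator of the kind summarized above typically minimizes a sum of a marginalization prior, IMU preintegration residuals, visual reprojection residuals, and double-differenced GNSS pseudorange/carrier-phase residuals over the window states $\mathcal{X}$:

$$
\min_{\mathcal{X}} \left\{ \left\| \mathbf{r}_p - \mathbf{H}_p \mathcal{X} \right\|^2 + \sum_{k \in \mathcal{B}} \left\| \mathbf{r}_{\mathcal{B}}\!\left(\hat{\mathbf{z}}_{b_{k+1}}^{b_k}, \mathcal{X}\right) \right\|_{\mathbf{P}_{b_{k+1}}^{b_k}}^{2} + \sum_{(l,j) \in \mathcal{C}} \left\| \mathbf{r}_{\mathcal{C}}\!\left(\hat{\mathbf{z}}_{l}^{c_j}, \mathcal{X}\right) \right\|_{\mathbf{P}_{l}^{c_j}}^{2} + \sum_{(s,k) \in \mathcal{G}} \left\| \mathbf{r}_{\mathcal{G}}\!\left(\hat{\mathbf{z}}_{s}^{k}, \mathcal{X}\right) \right\|_{\mathbf{P}_{s}^{k}}^{2} \right\}
$$

Here $\mathcal{X}$ would collect the window poses, velocities, IMU biases, landmark depths, and double-differenced float ambiguities, and $\mathcal{B}$, $\mathcal{C}$, $\mathcal{G}$ index the IMU, camera, and GNSS measurements, respectively. In such a pipeline, the float ambiguities estimated in this step are subsequently fixed to integers (e.g., via integer least squares) to produce the RTK fixed solution; the exact cost terms, weights, and ambiguity-resolution strategy used by the authors are detailed in the paper itself.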