Remote Sensing (Aug 2023)

Vehicle Localization in a Completed City-Scale 3D Scene Using Aerial Images and an On-Board Stereo Camera

  • Haihan Zhang,
  • Chun Xie,
  • Hisatoshi Toriya,
  • Hidehiko Shishido,
  • Itaru Kitahara

DOI
https://doi.org/10.3390/rs15153871
Journal volume & issue
Vol. 15, no. 15
p. 3871

Abstract

Simultaneous Localization and Mapping (SLAM) forms the foundation of vehicle localization in autonomous driving. Using high-precision 3D scene maps as prior information greatly assists the navigation of autonomous vehicles in large-scale 3D scenes. However, generating high-precision maps is complex and costly, which poses challenges to commercialization. This paper therefore proposes a global localization system that employs low-precision, city-scale 3D scene maps reconstructed by unmanned aerial vehicles (UAVs) to optimize visual positioning for vehicles. To address the discrepancies in image information caused by the differing aerial and ground perspectives, a wall complementarity algorithm based on the geometric structure of buildings is introduced to refine the city-scale 3D scene. A 3D-to-3D feature registration algorithm is then developed to determine the vehicle location by registering the local scene generated by an on-board stereo camera against the optimized city-scale scene. Simulation experiments conducted in a computer graphics (CG) simulator show that the completed low-precision scene model achieves vehicle localization with an average error of 3.91 m, close to the 3.27 m error obtained with the high-precision map, validating the effectiveness of the proposed algorithm. The system demonstrates the feasibility of using low-precision city-scale 3D scene maps generated by UAVs for vehicle localization in large-scale scenes.
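
To illustrate the core idea behind the 3D-to-3D registration step, the Python sketch below aligns matched 3D feature points from the on-board stereo reconstruction to the city-scale map with a closed-form rigid-body (Kabsch/SVD) solution and reads the vehicle position from the resulting transform. This is a minimal illustration under the assumption that point correspondences are already known; the function and variable names are hypothetical and do not reproduce the paper's actual registration pipeline.

    import numpy as np

    def estimate_rigid_transform(local_pts, map_pts):
        """Estimate the rigid transform (R, t) that maps matched 3D feature
        points from the on-board local scene into the city-scale map frame,
        using the closed-form Kabsch/SVD solution.

        local_pts, map_pts : (N, 3) arrays of corresponding points.
        """
        c_local = local_pts.mean(axis=0)
        c_map = map_pts.mean(axis=0)
        H = (local_pts - c_local).T @ (map_pts - c_map)   # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflection
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = c_map - R @ c_local
        return R, t

    # Hypothetical usage: the vehicle position in the city-scale frame is the
    # transformed origin of the stereo camera's local coordinate system.
    rng = np.random.default_rng(0)
    local = rng.uniform(-10.0, 10.0, size=(50, 3))        # synthetic local feature points
    R_true = np.array([[0.0, -1.0, 0.0],
                       [1.0,  0.0, 0.0],
                       [0.0,  0.0, 1.0]])
    t_true = np.array([120.0, 45.0, 1.5])
    map_pts = local @ R_true.T + t_true                   # same points in the map frame
    R, t = estimate_rigid_transform(local, map_pts)
    vehicle_position = R @ np.zeros(3) + t                # camera origin in the map frame
    print(vehicle_position)                               # approximately [120. 45. 1.5]

In practice, the correspondences would come from a feature matching stage and the estimate would be refined robustly (e.g., with outlier rejection); the closed-form alignment above only conveys how a 3D-to-3D registration yields the vehicle pose in the city-scale map.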

Keywords