The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences (Sep 2018)

LOCALIZATION OF A CAR BASED ON MULTI-SENSOR FUSION

  • H. Kim,
  • I. Lee

DOI: https://doi.org/10.5194/isprs-archives-XLII-1-247-2018
Journal volume & issue: Vol. XLII-1, pp. 247–250

Abstract


Vehicle localization is an essential component of stable autonomous car operation. Many localization algorithms exist, but they still need improvement in terms of accuracy and cost. In this paper, a sensor-fusion-based localization algorithm is used to address this problem. Our sensor system is composed of in-vehicle sensors, GPS, and vision sensors. The localization algorithm is based on an extended Kalman filter and consists of a time update step and a measurement update step. In the time update step, in-vehicle sensors such as the yaw-rate and speed sensors are used. GPS and vision sensor information are then used to update the vehicle position in the measurement update step. We use a visual odometry library to process the vision sensor data and generate the moving distance and direction of the car. In particular, when performing visual odometry we use a georeferenced image database to reduce error accumulation. Through experiments, the proposed localization algorithm is verified and evaluated. The RMS error of the estimate produced by the proposed algorithm is about 4.3 m, roughly a 40 % improvement in accuracy compared with the GPS-only method. This shows the feasibility of the proposed localization algorithm. However, the accuracy must still be improved before the algorithm can be applied to autonomous cars. Therefore, we plan to use multiple cameras (rear cameras or AVM cameras) and additional information such as high-definition maps or V2X communication. The filter and error modelling also need to be refined to obtain better results.
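
The abstract does not give implementation details, so the following is only a minimal sketch of an extended Kalman filter with the structure described above: a time update driven by in-vehicle speed and yaw-rate, followed by measurement updates from a GPS position fix and a visual-odometry-derived heading. The planar motion model, the treatment of visual odometry as a heading observation, and all noise values and function names are illustrative assumptions, not the authors' implementation.

    # Minimal EKF localization sketch (illustrative; model and noise values are assumptions).
    # State x = [px, py, heading]; time update uses in-vehicle speed and yaw-rate,
    # measurement update fuses a GPS position fix and a visual-odometry heading estimate.
    import numpy as np

    def time_update(x, P, v, yaw_rate, dt, Q):
        """Propagate the state with a simple planar motion model."""
        px, py, th = x
        x_pred = np.array([px + v * dt * np.cos(th),
                           py + v * dt * np.sin(th),
                           th + yaw_rate * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        return x_pred, F @ P @ F.T + Q

    def measurement_update(x, P, z, H, R):
        """Standard Kalman correction step for a linear measurement z = H x + noise."""
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x_new = x + K @ y
        P_new = (np.eye(len(x)) - K @ H) @ P
        return x_new, P_new

    # Example: one prediction step, then a GPS update and a visual-odometry update.
    x = np.array([0.0, 0.0, 0.0])          # initial position (m) and heading (rad)
    P = np.eye(3)
    Q = np.diag([0.1, 0.1, 0.01])          # assumed process noise
    x, P = time_update(x, P, v=10.0, yaw_rate=0.02, dt=0.1, Q=Q)

    H_gps = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
    R_gps = np.diag([9.0, 9.0])            # assumed GPS noise (~3 m standard deviation)
    x, P = measurement_update(x, P, z=np.array([1.1, 0.05]), H=H_gps, R=R_gps)

    H_vo = np.array([[0.0, 0.0, 1.0]])     # visual odometry treated here as a heading observation
    R_vo = np.array([[0.01]])
    x, P = measurement_update(x, P, z=np.array([0.015]), H=H_vo, R=R_vo)
    print(x)

In the paper's setting, the visual-odometry measurement model would instead relate the estimated moving distance and direction (corrected against the georeferenced image database) to the state; the sketch above only shows where such an observation would enter the filter.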