IEEE Access (Jan 2024)

4D Radar-Camera Sensor Fusion for Robust Vehicle Pose Estimation in Foggy Environments

  • Seunghoon Yang,
  • Minseong Choi,
  • Seungho Han,
  • Keun-Ha Choi,
  • Kyung-Soo Kim

DOI
https://doi.org/10.1109/ACCESS.2023.3345375
Journal volume & issue
Vol. 12
pp. 16178–16188

Abstract

The integration of cameras and millimeter-wave radar into sensor fusion algorithms is essential to ensure robustness and cost-effectiveness for vehicle pose estimation. Due to the low resolution of traditional radar, several studies have investigated 4D imaging radar, which provides range, Doppler, azimuth, and elevation information with high resolution. This paper presents a method for robustly estimating vehicle pose through 4D radar and camera fusion, utilizing the complementary characteristics of each sensor. Leveraging the single-view geometry of the detected vehicle bounding box, the lateral position is derived based on the camera images, and the yaw rate is calculated through feature matching between consecutive images. The high-resolution 4D radar data are used to estimate the heading angle and forward velocity of the target vehicle by leveraging the position and Doppler velocity information. Finally, an extended Kalman filter (EKF) is employed to fuse the physical quantities obtained by each sensor, yielding more robust pose estimates. To validate the performance of the proposed method, experiments were conducted in foggy environments, including straight and curved driving scenarios. The experimental results indicate that the performance of the camera-based method is reduced due to frame loss in visually challenging scenarios such as foggy environments, whereas the proposed method exhibits superior performance and enhanced robustness.
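The fusion stage described in the abstract can be illustrated with a minimal EKF sketch. The state vector, motion model, and noise values below are assumptions for illustration only (the paper's exact formulation is not given here): the camera contributes lateral position and yaw rate measurements, while the 4D radar contributes heading angle and forward velocity.

```python
import numpy as np

class PoseEKF:
    """Hedged sketch of an EKF fusing camera and 4D-radar measurements.

    Assumed state x = [lateral position y, heading psi, yaw rate r, speed v];
    all noise covariances are illustrative placeholders.
    """

    def __init__(self, dt=0.1):
        self.dt = dt
        self.x = np.zeros(4)                        # [y, psi, r, v]
        self.P = np.eye(4)                          # state covariance
        self.Q = np.diag([0.05, 0.01, 0.05, 0.2])   # process noise (assumed)

    def predict(self):
        dt = self.dt
        y, psi, r, v = self.x
        # Simple kinematic model: lateral drift from heading; heading
        # integrates yaw rate; yaw rate and speed follow a random walk.
        self.x = np.array([y + v * np.sin(psi) * dt,
                           psi + r * dt,
                           r,
                           v])
        # Jacobian of the motion model, linearized at the current state.
        F = np.array([[1.0, v * np.cos(psi) * dt, 0.0, np.sin(psi) * dt],
                      [0.0, 1.0, dt, 0.0],
                      [0.0, 0.0, 1.0, 0.0],
                      [0.0, 0.0, 0.0, 1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def _update(self, z, H, R):
        # Standard Kalman measurement update (measurements are linear here).
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(4) - K @ H) @ self.P

    def update_camera(self, lateral, yaw_rate):
        # Camera: lateral position (bounding-box geometry) and yaw rate
        # (feature matching between consecutive frames).
        H = np.array([[1.0, 0.0, 0.0, 0.0],
                      [0.0, 0.0, 1.0, 0.0]])
        self._update(np.array([lateral, yaw_rate]), H, np.diag([0.1, 0.05]))

    def update_radar(self, heading, speed):
        # 4D radar: heading angle and forward velocity
        # (from position and Doppler information).
        H = np.array([[0.0, 1.0, 0.0, 0.0],
                      [0.0, 0.0, 0.0, 1.0]])
        self._update(np.array([heading, speed]), H, np.diag([0.02, 0.1]))
```

The complementarity claimed in the abstract shows up naturally in this structure: when fog causes camera frame loss, `update_camera` is simply skipped for that cycle and the radar update alone keeps the estimate bounded.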

Keywords