IEEE Access (Jan 2023)

F-ROADNET: Late Fusion-Based Automotive Radar Object Detection

  • Gulbadan Sikander,
  • Shahzad Anwar,
  • Ghassan Husnain,
  • Sangsoon Lim

DOI
https://doi.org/10.1109/ACCESS.2023.3343383
Journal volume & issue
Vol. 11
pp. 142893 – 142902

Abstract

Road user categorization is essential for autonomous driving perception, particularly in challenging traffic situations involving unfavorable weather (such as fog, snow, and rain) and dim lighting. Several kinds of sensors need to be researched in order to achieve the precision and resilience that autonomous systems demand. Currently, cameras and laser scanners (LiDAR) are principally employed to create a depiction of the environment surrounding the vehicle. Despite their enticing qualities, Radar sensors are currently underutilized for autonomous driving, even though they have been employed in the automobile industry for a long time. Radar’s ability to measure the relative speed of obstacles and to operate even in adverse weather conditions makes it a front-line contender for road user detection. This study proposes F-ROADNET, a multi-object classification method for vulnerable road users based on raw Radar data. The model is trained on Range-Angle and Range-Doppler maps using a late fusion architecture. F-ROADNET achieves a detection accuracy of 99.01%, precision of 99.3%, and recall of 99% on the CARRADA dataset, and a detection accuracy of 91.62%, precision of 87.2%, and recall of 90.2% on the RADDet dataset. The findings show that F-ROADNET outperforms established methods in terms of average precision.
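To illustrate the general late-fusion pattern described in the abstract, the minimal PyTorch sketch below processes Range-Angle and Range-Doppler maps in two independent CNN branches and combines their features only at the final classification stage. The layer sizes, feature dimensions, number of classes, and branch design are illustrative assumptions, not the published F-ROADNET configuration.

```python
# Minimal late-fusion sketch over Range-Angle (RA) and Range-Doppler (RD) maps.
# All hyperparameters here are assumptions for illustration only.
import torch
import torch.nn as nn

class RadarBranch(nn.Module):
    """Small CNN that encodes a single radar view (RA or RD map)."""
    def __init__(self, in_channels: int = 1, feat_dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> fixed-size vector
        )
        self.proj = nn.Linear(32, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.proj(self.features(x).flatten(1))

class LateFusionClassifier(nn.Module):
    """Each view is encoded independently; features are fused only at the end."""
    def __init__(self, num_classes: int = 4, feat_dim: int = 128):
        super().__init__()
        self.ra_branch = RadarBranch(feat_dim=feat_dim)
        self.rd_branch = RadarBranch(feat_dim=feat_dim)
        self.classifier = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, ra_map: torch.Tensor, rd_map: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.ra_branch(ra_map), self.rd_branch(rd_map)], dim=1)
        return self.classifier(fused)

if __name__ == "__main__":
    model = LateFusionClassifier()
    ra = torch.randn(2, 1, 256, 256)  # batch of Range-Angle maps (assumed size)
    rd = torch.randn(2, 1, 256, 64)   # batch of Range-Doppler maps (assumed size)
    print(model(ra, rd).shape)        # -> torch.Size([2, 4])
```

The defining choice of late fusion is that the two radar views share no parameters and are never mixed until after per-view feature extraction, which lets each branch specialize in its own representation before the classifier combines them.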

Keywords