Sensors (Feb 2023)

Vehicle Detection on Occupancy Grid Maps: Comparison of Five Detectors Regarding Real-Time Performance

  • Nils Defauw,
  • Marielle Malfante,
  • Olivier Antoni,
  • Tiana Rakotovao,
  • Suzanne Lesecq

DOI: https://doi.org/10.3390/s23031613
Journal volume & issue: Vol. 23, no. 3, p. 1613

Abstract

Occupancy grid maps are widely used as an environment model that allows the fusion of different range sensor technologies in real-time for robotics applications. In an autonomous vehicle setting, occupancy grid maps are especially useful for their ability to accurately represent the position of surrounding obstacles while remaining robust to discrepancies between the fused sensors, thanks to occupancy probabilities that represent uncertainty. In this article, we evaluate the applicability of real-time vehicle detection on occupancy grid maps. State-of-the-art detectors from sensor-specific domains, such as YOLOv2/YOLOv3 for images or PIXOR for LiDAR point clouds, are modified to take occupancy grid maps as input and produce oriented bounding boxes enclosing vehicles as output. The five proposed detectors are trained on the Waymo Open automotive dataset and compared in terms of detection quality, measured as Average Precision (AP), and real-time capability, measured in Frames Per Second (FPS). Of the five detectors presented, one, inspired by the PIXOR backbone, reaches the highest AP0.7 of 0.82 and runs at 20 FPS. Comparatively, two other proposed detectors, inspired by YOLOv2, achieve an almost as good AP0.7 of 0.79 while running at 91 FPS. These results validate the feasibility of real-time vehicle detection on occupancy grids.
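The abstract refers to fusing range sensors into an occupancy grid where each cell holds an occupancy probability. As a rough illustration of that idea (not the paper's implementation), the sketch below fuses a single range measurement into a 2D grid using the standard log-odds update; the grid size, cell resolution, and sensor-model constants are assumptions chosen for clarity.

```python
# Illustrative sketch: Bayesian occupancy grid fusion of one range measurement
# via log-odds updates. All constants below are assumptions, not values from the paper.
import numpy as np

GRID_SIZE = 100              # cells per side (assumed)
CELL_RES = 0.5               # metres per cell (assumed)
L_OCC = np.log(0.7 / 0.3)    # log-odds increment for the cell at the measured range
L_FREE = np.log(0.3 / 0.7)   # log-odds decrement for cells the beam passed through

def update_grid(log_odds, origin, angle, measured_range):
    """Fuse one range measurement (a single beam) into the log-odds grid."""
    step = CELL_RES / 2.0
    # Cells traversed by the beam before the hit are evidence of free space.
    for r in np.arange(0.0, max(measured_range - CELL_RES, 0.0), step):
        x = int((origin[0] + r * np.cos(angle)) / CELL_RES)
        y = int((origin[1] + r * np.sin(angle)) / CELL_RES)
        if 0 <= x < GRID_SIZE and 0 <= y < GRID_SIZE:
            log_odds[y, x] += L_FREE
    # The cell at the measured range is evidence of an obstacle.
    hx = int((origin[0] + measured_range * np.cos(angle)) / CELL_RES)
    hy = int((origin[1] + measured_range * np.sin(angle)) / CELL_RES)
    if 0 <= hx < GRID_SIZE and 0 <= hy < GRID_SIZE:
        log_odds[hy, hx] += L_OCC
    return log_odds

# Usage: fuse one beam, then convert log-odds back to occupancy probabilities.
grid = np.zeros((GRID_SIZE, GRID_SIZE))
grid = update_grid(grid, origin=(25.0, 25.0), angle=np.pi / 4, measured_range=10.0)
prob = 1.0 / (1.0 + np.exp(-grid))   # per-cell occupancy probability in [0, 1]
```

Working in log-odds keeps the per-cell update additive, which is what makes fusing many beams from heterogeneous range sensors cheap enough for real-time use; the resulting probability image is the input the detectors in the paper operate on.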

Keywords