IEEE Access (Jan 2024)

Analysis of the Impact of Lens Blur on Safety-Critical Automotive Object Detection

  • Dara Molloy,
  • Patrick Muller,
  • Brian Deegan,
  • Darragh Mullins,
  • Jonathan Horgan,
  • Enda Ward,
  • Edward Jones,
  • Alexander Braun,
  • Martin Glavin

DOI: https://doi.org/10.1109/ACCESS.2023.3348663
Journal volume & issue: Vol. 12, pp. 3554–3569

Abstract

Camera-based object detection is widely used in safety-critical applications such as advanced driver assistance systems (ADAS) and autonomous vehicle research. Road infrastructure has been designed for human vision, so computer vision with RGB cameras is a vital source of semantic information from the environment. Other sensors, such as LIDAR and RADAR, are also often used in these applications; however, cameras provide higher spatial resolution and color information. The spatial frequency response (SFR), or sharpness, of a camera used in an object detection system must be sufficient to allow a detection algorithm to reliably localize objects in the environment over the camera's lifetime. This study explores the relationship between object detection performance and SFR. Six state-of-the-art object detection models are evaluated with varying levels of lens defocus. A novel raw image dataset is created and utilized, containing pedestrians and cars at distances up to 100 m from the sensor. Object detection performance for each defocused dataset is analyzed over a range of distances to determine the minimum SFR necessary in each case. Results show that the relationship between object detection performance and lens blur is much more complex than previous studies have found, due to lens field curvature, chromatic aberration, and astigmatism. We found that smaller objects are disproportionately impacted by lens blur, and that different object detection models exhibit differing levels of robustness to lens blur.
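The abstract describes sweeping lens defocus and relating it to sharpness (SFR). The paper itself measures the SFR of real defocused optics, but the core idea — sharpness falls as defocus grows — can be sketched with a simple stand-in: blur a synthetic step edge with Gaussian kernels of increasing width and track the peak edge gradient as a crude sharpness proxy. The Gaussian-blur model, the `sigma` values, and the gradient-based proxy below are illustrative assumptions, not the authors' measurement pipeline.

```python
import numpy as np

def gaussian_kernel(sigma):
    # 1-D normalized Gaussian kernel; Gaussian blur is used here as a
    # rough stand-in for optical defocus (an assumption, not the paper's
    # actual lens model).
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def blur_edge(edge, sigma):
    # Simulate defocus by convolving a 1-D edge profile with the kernel.
    return np.convolve(edge, gaussian_kernel(sigma), mode="same")

def edge_sharpness(profile):
    # Crude sharpness proxy: peak gradient across the edge transition.
    # Real SFR/MTF analysis would Fourier-transform this derivative.
    return float(np.abs(np.diff(profile)).max())

# Synthetic dark-to-bright step edge (an idealized slanted-edge target).
edge = np.concatenate([np.zeros(50), np.ones(50)])

for sigma in (0.5, 1.0, 2.0, 4.0):
    print(f"sigma={sigma}: sharpness={edge_sharpness(blur_edge(edge, sigma)):.3f}")
```

Running the sweep shows the proxy falling monotonically with `sigma`, mirroring how increasing defocus lowers measured SFR; in the paper this degradation is then correlated with per-distance detection performance.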

Keywords