Sensors (Feb 2024)

Dense Out-of-Distribution Detection by Robust Learning on Synthetic Negative Data

  • Matej Grcić,
  • Petra Bevandić,
  • Zoran Kalafatić,
  • Siniša Šegvić

DOI
https://doi.org/10.3390/s24041248
Journal volume & issue
Vol. 24, no. 4
p. 1248

Abstract


Standard machine learning is unable to accommodate inputs that do not belong to the training distribution. The resulting models often give rise to confident incorrect predictions, which may lead to devastating consequences. This problem is especially demanding in the context of dense prediction, since input images may be only partially anomalous. Previous work has addressed dense out-of-distribution detection by discriminative training with respect to off-the-shelf negative datasets. However, real negative data may lead to over-optimistic evaluation due to possible overlap with test anomalies. We therefore extend this approach by generating synthetic negative patches along the border of the inlier manifold. We leverage a jointly trained normalizing flow, owing to its coverage-oriented learning objective and its capability to generate samples at different resolutions. We detect anomalies according to a principled information-theoretic criterion that can be applied consistently during both training and inference. The resulting models set the new state of the art on benchmarks for out-of-distribution detection in road-driving scenes and remote sensing imagery, despite minimal computational overhead.
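The abstract does not spell out the dense anomaly score, so the following is only a minimal illustrative sketch of the general idea behind information-theoretic dense OOD detection: scoring each pixel by the entropy of its softmax predictive distribution, which training on negative data pushes toward uniform (maximum entropy) for anomalous pixels. The function name, threshold, and class count are hypothetical and not taken from the paper.

```python
import numpy as np

def dense_entropy_score(logits):
    """Per-pixel anomaly score from predictive entropy (illustrative, not the paper's exact criterion).

    logits: array of shape (C, H, W) -- per-class scores for each pixel.
    Returns an (H, W) map where higher values suggest out-of-distribution pixels.
    """
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=0, keepdims=True)
    p = np.exp(z)
    p /= p.sum(axis=0, keepdims=True)
    # Shannon entropy per pixel: near zero for confident inlier predictions,
    # near log(C) for the uniform distribution that negatives are trained toward.
    return -(p * np.log(p + 1e-12)).sum(axis=0)

# Usage: threshold the score map to obtain a dense OOD mask.
rng = np.random.default_rng(0)
logits = rng.normal(size=(19, 4, 4))   # e.g. 19 semantic classes, 4x4 image
score = dense_entropy_score(logits)    # (4, 4) anomaly map
mask = score > 0.9 * np.log(19)        # hypothetical threshold near max entropy
```

A score of this form can be computed from the same logits used for segmentation, which is consistent with the paper's claim of minimal computational overhead.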

Keywords