PLoS ONE (Jan 2025)

SAMF-YOLO: A self-supervised, high-precision approach for defect detection in complex industrial environments.

  • Jun Huang,
  • Shamsul Arrieya Ariffin,
  • Qiang Zhu,
  • Wanting Xu,
  • Qun Yang

DOI: https://doi.org/10.1371/journal.pone.0327001
Journal volume & issue: Vol. 20, no. 7, p. e0327001

Abstract

As object detection models grow in complexity, balancing computational efficiency with feature expressiveness becomes a critical challenge. To address this, we propose SAMF-YOLO, a novel model integrating three key components: SONet, BFAM, and FASFF-Head. The SONet backbone, built on UniRepLKNet and enhanced by the Star Operation, expands the feature space with high efficiency. FASFF-Head performs adaptive multi-scale feature fusion with minimal overhead, and the Bi-temporal Feature Aggregation Module (BFAM) strengthens the detection of small defects. In addition, the Focaler-IoU loss improves bounding box regression for challenging object scales, and a self-supervised contrastive learning strategy enhances feature representation and model robustness without relying on labeled data. Experimental results show that SAMF-YOLO surpasses YOLOv11s with a 6.38% improvement in mAP@0.5 and a notable reduction in computational cost, confirming its advantages in accuracy, efficiency, and robustness. The code is released at https://github.com/Missing24ff/SAMF-YOLO.git.
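The Focaler-IoU term mentioned in the abstract follows the published formulation in which the plain IoU is linearly rescaled inside an interval [d, u] (clamped to 0 below d and 1 above u), so the regression loss concentrates on a chosen range of sample difficulty. The sketch below illustrates that idea in PyTorch; the helper names, the (d, u) values, and the pairing with a simple axis-aligned IoU are illustrative assumptions, not the paper's exact implementation.

```python
import torch


def box_iou(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """Plain IoU between matched (x1, y1, x2, y2) boxes of shape (N, 4)."""
    lt = torch.max(pred[:, :2], target[:, :2])          # top-left of intersection
    rb = torch.min(pred[:, 2:], target[:, 2:])          # bottom-right of intersection
    wh = (rb - lt).clamp(min=0)                          # zero out empty overlaps
    inter = wh[:, 0] * wh[:, 1]
    area_p = (pred[:, 2] - pred[:, 0]) * (pred[:, 3] - pred[:, 1])
    area_t = (target[:, 2] - target[:, 0]) * (target[:, 3] - target[:, 1])
    return inter / (area_p + area_t - inter + 1e-7)


def focaler_iou_loss(pred: torch.Tensor, target: torch.Tensor,
                     d: float = 0.0, u: float = 0.95) -> torch.Tensor:
    """Focaler-IoU regression loss: IoU is linearly rescaled within [d, u]
    before being turned into a loss, steering gradients toward the chosen
    difficulty range. The (d, u) values here are placeholders."""
    iou = box_iou(pred, target)
    iou_focaler = ((iou - d) / (u - d)).clamp(0.0, 1.0)  # 0 below d, 1 above u
    return (1.0 - iou_focaler).mean()


if __name__ == "__main__":
    # Toy example with two predicted/ground-truth box pairs.
    pred = torch.tensor([[10., 10., 50., 50.], [0., 0., 20., 20.]])
    target = torch.tensor([[12., 12., 48., 52.], [5., 5., 25., 25.]])
    print(focaler_iou_loss(pred, target))
```

In practice such a term would be combined with a detector's existing box loss (e.g., a CIoU- or GIoU-based term in a YOLO head) rather than used in isolation.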