IEEE Access (Jan 2019)

SFA: Small Faces Attention Face Detector

  • Shi Luo,
  • Xiongfei Li,
  • Rui Zhu,
  • Xiaoli Zhang

DOI: https://doi.org/10.1109/ACCESS.2019.2955757
Journal volume & issue: Vol. 7, pp. 171609–171620

Abstract


Tremendous strides have been made in face detection thanks to convolutional neural networks. However, the performance of previous face detectors deteriorates dramatically as the face scale shrinks. In this paper, we propose a novel scale-invariant face detector, named the Small Faces Attention (SFA) face detector, for better detecting small faces. We first present a multi-branch face detection architecture that pays more attention to faces at small scales. Then, the feature maps of neighbouring branches are fused so that features from larger scales can assist in detecting hard, small-scale faces. Finally, we adopt multi-scale training and testing simultaneously to make our model robust to various scales. Comprehensive experiments show that SFA significantly improves face detection performance, especially on small faces. Our method achieves promising detection performance on challenging face detection benchmarks, including the WIDER FACE and FDDB datasets, with competitive runtime speed. Both our code and model will be available at https://github.com/shiluo1990/SFA.
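To illustrate the kind of neighbouring-branch fusion described in the abstract, the sketch below shows one plausible realisation in PyTorch: a deeper (coarser) branch's feature map is projected, upsampled, and added to a shallower branch before prediction. This is a hedged, FPN-style assumption for illustration only; the module name `NeighbourBranchFusion`, the channel widths, and the add-then-smooth scheme are not taken from the SFA paper and may differ from the authors' actual design.

```python
# Minimal sketch of fusing feature maps from neighbouring detection branches,
# so that features from a larger-scale branch can assist small-face detection.
# Illustrative assumption (FPN-style lateral fusion), not the authors' exact SFA module.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeighbourBranchFusion(nn.Module):
    """Fuse a deeper (coarser) branch's features into a shallower branch."""

    def __init__(self, shallow_channels: int, deep_channels: int, out_channels: int = 256):
        super().__init__()
        # 1x1 convs project both branches to a common channel width.
        self.lateral = nn.Conv2d(shallow_channels, out_channels, kernel_size=1)
        self.top_down = nn.Conv2d(deep_channels, out_channels, kernel_size=1)
        # 3x3 conv smooths the fused map before the detection heads use it.
        self.smooth = nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)

    def forward(self, shallow_feat: torch.Tensor, deep_feat: torch.Tensor) -> torch.Tensor:
        lateral = self.lateral(shallow_feat)
        # Upsample the deeper branch to the shallower branch's spatial size.
        top_down = F.interpolate(self.top_down(deep_feat),
                                 size=lateral.shape[-2:], mode="nearest")
        return self.smooth(lateral + top_down)


if __name__ == "__main__":
    # Example: a shallow branch (stride 8) and its deeper neighbour (stride 16).
    shallow = torch.randn(1, 512, 80, 80)
    deep = torch.randn(1, 1024, 40, 40)
    fused = NeighbourBranchFusion(512, 1024)(shallow, deep)
    print(fused.shape)  # torch.Size([1, 256, 80, 80])
```

For the released implementation and trained models, refer to the repository linked in the abstract.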

Keywords