IEEE Access (Jan 2024)

Accurate Neonatal Face Detection for Improved Pain Classification in the Challenging NICU Setting

  • Jacqueline Hausmann,
  • Md Sirajus Salekin,
  • Ghada Zamzmi,
  • Peter R. Mouton,
  • Stephanie Prescott,
  • Thao Ho,
  • Yu Sun,
  • Dmitry Goldgof

DOI
https://doi.org/10.1109/ACCESS.2024.3383789
Journal volume & issue
Vol. 12
pp. 49122–49133

Abstract

Object detection systems built on off-the-shelf algorithms tend to fail when deployed in complex scenes. The present work describes a case for detecting facial expression in post-surgical neonates (newborns) as a modality for predicting and classifying severe pain in the Neonatal Intensive Care Unit (NICU). Our initial testing showed that both an off-the-shelf face detector and a machine learning algorithm trained on adult faces failed to detect the facial expressions of neonates in the NICU. We improved accuracy in this complex scene by training a state-of-the-art “You-Only-Look-Once” (YOLO) face detection model on the USF-MNPAD-I dataset of neonate faces. At run time, our trained YOLO model showed a difference of 8.6% in mean Average Precision (mAP) and 21.2% in Area Under the ROC Curve (AUC) for automatic classification of neonatal pain compared with manual pain scoring by NICU nurses. Given the challenges, time, and effort associated with collecting ground truth from the faces of post-surgical neonates, we share the weights from training our YOLO model on these facial-expression data. These weights can facilitate the further development of accurate strategies for detecting facial expression, which can be used to predict the time to pain onset in combination with other sensory modalities (body movements, crying frequency, vital signs). Reliable prediction of time to pain onset in turn creates a therapeutic window during which NICU nurses and providers can implement safe and effective strategies to mitigate severe pain in this vulnerable patient population.
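
For readers who want to reuse the released detector weights, the sketch below shows how they might be loaded for inference on a single NICU frame. The abstract does not state which YOLO variant or framework the authors used, so this assumes the weights are compatible with the Ultralytics YOLO Python interface; the file names neonate_face_yolo.pt and nicu_frame.jpg are placeholders, not artifacts from the paper.

    # Minimal inference sketch (assumption: weights load via the Ultralytics
    # YOLO package; file names are placeholders, not taken from the paper).
    from ultralytics import YOLO

    model = YOLO("neonate_face_yolo.pt")                 # released detector weights (hypothetical name)
    results = model.predict("nicu_frame.jpg", conf=0.5)  # one frame captured in the NICU

    for box in results[0].boxes:                         # each detected neonatal face
        x1, y1, x2, y2 = box.xyxy[0].tolist()            # bounding box in pixel coordinates
        print(f"face at ({x1:.0f}, {y1:.0f})-({x2:.0f}, {y2:.0f}), "
              f"confidence {float(box.conf):.2f}")

In the workflow the abstract describes, the detected face region would then serve as input to the downstream pain-classification step.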

Keywords