JMIR mHealth and uHealth (Jan 2021)

Utilization of Smartphone Depth Mapping Cameras for App-Based Grading of Facial Movement Disorders: Development and Feasibility Study

  • Taeger, Johannes,
  • Bischoff, Stefanie,
  • Hagen, Rudolf,
  • Rak, Kristen

DOI: https://doi.org/10.2196/19346
Journal volume & issue: Vol. 9, No. 1, p. e19346

Abstract

Background: For the classification of facial paresis, various systems of description and evaluation are available in the form of clinician-graded or software-based scoring systems. They serve the purpose of scientific and clinical assessment of the spontaneous course of the disease or of monitoring therapeutic interventions. Nevertheless, none has achieved universal acceptance in everyday clinical practice. Hence, a quick and precise tool for assessing the functional status of the facial nerve would be desirable. In this context, the possibilities offered by the TrueDepth camera of recent iPhone models sparked our interest.

Objective: This paper describes the utilization of the iPhone's TrueDepth camera via a specially developed app prototype for quick, objective, and reproducible quantification of facial asymmetries.

Methods: After conceptual and user interface design, a native app prototype for iOS was programmed that accesses and processes the data of the TrueDepth camera. A dedicated algorithm computes a new index for the grading of unilateral facial paresis, ranging from 0% to 100%. The algorithm was adapted to the well-established Stennert index by weighting the individual facial regions according to functional and cosmetic aspects. Test measurements with healthy subjects were performed with the app to verify the reliability of the system.

Results: After the development process, the app prototype exhibited no build-time or runtime errors and also worked under suboptimal conditions, such as varying measurement angles, thus meeting our criteria for a safe and reliable app. The newly defined index expresses the measurement result as a generally understandable percentage value for each half of the face. The measurements, which correctly rated the facial expressions of healthy individuals as symmetrical in all cases, were reproducible and showed no statistically significant intertest variability.

Conclusions: Based on the experience with the app prototype in assessing healthy subjects, the use of the TrueDepth camera should have considerable potential for app-based grading of facial movement disorders. The app and its algorithm, which is based on theoretical considerations, should be evaluated in a prospective clinical study and correlated with established facial grading scores.
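The abstract leaves the implementation unspecified, so the following is only a minimal sketch of the described idea: left-right differences in facial movement are weighted per facial region and aggregated into a percentage index. It assumes the app reads ARKit's blend-shape coefficients from the TrueDepth camera (one plausible access path); the region grouping, the weights, and the name FacialAsymmetryGrader are illustrative assumptions, not the authors' published algorithm.

import ARKit

// Hypothetical sketch: computes a single 0-100 asymmetry index from ARKit
// blend-shape coefficients. The region weights are placeholders standing in
// for the paper's Stennert-index-based weighting, which the abstract does
// not specify.
struct FacialAsymmetryGrader {

    // Left/right blend-shape pairs, grouped into three coarse facial regions.
    private static let regions: [(weight: Double,
                                  pairs: [(ARFaceAnchor.BlendShapeLocation,
                                           ARFaceAnchor.BlendShapeLocation)])] = [
        (0.3, [(.browOuterUpLeft, .browOuterUpRight),     // forehead/brow
               (.browDownLeft,    .browDownRight)]),
        (0.3, [(.eyeBlinkLeft,    .eyeBlinkRight),        // eye closure
               (.eyeSquintLeft,   .eyeSquintRight)]),
        (0.4, [(.mouthSmileLeft,  .mouthSmileRight),      // mouth/midface
               (.mouthFrownLeft,  .mouthFrownRight)])
    ]

    // Returns an asymmetry index in 0...100 (0 = perfectly symmetric).
    // Each blend-shape coefficient lies in 0...1, so the absolute left-right
    // difference per pair is also in 0...1; the region weights sum to 1.0.
    static func index(for blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) -> Double {
        var weighted = 0.0
        for region in regions {
            var regionDiff = 0.0
            for (left, right) in region.pairs {
                let l = blendShapes[left]?.doubleValue ?? 0
                let r = blendShapes[right]?.doubleValue ?? 0
                regionDiff += abs(l - r)
            }
            weighted += region.weight * (regionDiff / Double(region.pairs.count))
        }
        return weighted * 100
    }
}

In a running face-tracking session, such a function could be fed from an ARSessionDelegate callback, for example by passing faceAnchor.blendShapes from session(_:didUpdate:) for each updated ARFaceAnchor. Note that the paper's app reports a percentage value per half of the face, which this single-index sketch does not reproduce.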