IEEE Access (Jan 2023)

Computer Vision-Based Assessment of Autistic Children: Analyzing Interactions, Emotions, Human Pose, and Life Skills

  • Varun Ganjigunte Prakash,
  • Manu Kohli,
  • Swati Kohli,
  • A. P. Prathosh,
  • Tanu Wadhera,
  • Diptanshu Das,
  • Debasis Panigrahi,
  • John Vijay Sagar Kommu

DOI
https://doi.org/10.1109/ACCESS.2023.3269027
Journal volume & issue
Vol. 11
pp. 47907 – 47929

Abstract


This paper implements and tests computer vision applications for skill and emotion assessment of children with Autism Spectrum Disorder (ASD) by extracting bio-behaviors, human activities, child-therapist interactions, and joint pose estimates from recorded videos of interactive single- or two-person play-based intervention sessions. A comprehensive dataset of 300 videos is amassed from ASD children engaged in social interaction, and three novel deep learning-based vision models are developed: (i) an activity comprehension model that analyzes child-play partner interactions; (ii) an automatic joint attention recognition framework based on head and hand pose; and (iii) an emotion and facial expression recognition model. The models are also evaluated on 68 unseen real-world videos of children captured in the clinic and on public datasets. The activity comprehension model achieves an overall accuracy of 72.32%, the joint attention recognition models reach 97% accuracy for follow eye gaze and 93.4% for hand pointing, and the facial expression recognition model attains an overall accuracy of 95.1%. The proposed models can extract behaviors of interest, activity events, emotions, and social skills from long free-play and intervention session videos and provide temporal plots for session monitoring and assessment, thereby giving clinicians insightful data for the diagnosis, assessment, treatment formulation, and monitoring of children with ASD under limited supervision.
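To illustrate the kind of per-frame video analysis pipeline the abstract describes, the sketch below processes a recorded session frame by frame, extracts pose landmarks, feeds them to a downstream behavior classifier, and plots the predictions over time. This is a minimal illustrative sketch, not the authors' implementation: MediaPipe Pose stands in for the paper's pose estimation stage, the classify_joint_attention() stub is hypothetical, and "session.mp4" is a placeholder path.

```python
# Minimal sketch (not the authors' code) of a per-frame session-analysis
# pipeline: pose landmarks per frame -> behavior label -> temporal plot.
# Assumes opencv-python, mediapipe, and matplotlib are installed.

import cv2
import mediapipe as mp
import matplotlib.pyplot as plt

mp_pose = mp.solutions.pose


def classify_joint_attention(landmarks):
    """Hypothetical stub: in the paper, trained models map head/hand pose
    features to joint-attention events (e.g., follow eye gaze, hand pointing).
    Here we only report whether a person was detected in the frame."""
    return 1 if landmarks is not None else 0


def analyse_session(video_path):
    """Return a per-frame timeline of (stubbed) behavior predictions."""
    cap = cv2.VideoCapture(video_path)
    timeline = []
    with mp_pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            timeline.append(classify_joint_attention(results.pose_landmarks))
    cap.release()
    return timeline


if __name__ == "__main__":
    labels = analyse_session("session.mp4")  # placeholder video path
    # Temporal plot of per-frame predictions, analogous to the session
    # monitoring plots described in the abstract.
    plt.step(range(len(labels)), labels, where="post")
    plt.xlabel("frame index")
    plt.ylabel("joint-attention detected")
    plt.show()
```

In practice, the classifier stub would be replaced by the trained activity comprehension, joint attention, and facial expression models, with each producing its own temporal track for clinician review.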

Keywords