Computers & Education: X Reality (Jan 2024)

Applying multimodal data fusion to track autistic adolescents’ representational flexibility development during virtual reality-based training

  • Jewoong Moon,
  • Fengfeng Ke,
  • Zlatko Sokolikj,
  • Shayok Chakraborty

Journal volume & issue
Vol. 4, p. 100063

Abstract

In our study, we harnessed multimodal data to develop a predictive model aimed at assessing the development of representational flexibility (RF) in autistic adolescents engaged in virtual reality (VR)-based cognitive skills training. Recognizing VR's potential to enhance RF through immersive 3D simulation tasks, we addressed the research gap in analyzing learners' digital interactions within this environment. This data mining study integrated diverse data sources—including behavioral cues, physiological responses, and direct interaction logs—collected from 178 training sessions with eight autistic adolescents. This comprehensive dataset, encompassing both audio and screen recordings, was analyzed using advanced machine learning techniques. Through decision-level data fusion, particularly employing the random forest algorithm, our model demonstrated enhanced accuracy in predicting RF development, surpassing single-source data approaches. This research not only contributes to the effective use of VR in educational interventions for autistic adolescents but also showcases the potential of multimodal data fusion in understanding complex cognitive skills development.
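To make the fusion step concrete: in decision-level fusion of the kind the abstract describes, a separate classifier is trained on each modality's features, and the per-modality predictions are then combined (e.g., by majority vote) into a single label. The sketch below is purely illustrative and is not the authors' code; the modality names, feature dimensions, and synthetic data are assumptions for demonstration.

```python
# Illustrative sketch of decision-level multimodal fusion (NOT the study's code).
# One random forest is trained per modality; their predictions are fused by
# majority vote. Modality names and synthetic features are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)  # binary label, e.g., RF-development indicator

# Hypothetical per-modality feature matrices: each carries a noisy copy of the
# label signal, mimicking behavioral, physiological, and interaction-log data.
modalities = {
    "behavioral": y[:, None] + rng.normal(0.0, 1.0, (n, 5)),
    "physiological": y[:, None] + rng.normal(0.0, 1.5, (n, 4)),
    "interaction_logs": y[:, None] + rng.normal(0.0, 2.0, (n, 6)),
}

train, test = slice(0, 150), slice(150, n)

# Train one classifier per modality on the training split.
models = {
    name: RandomForestClassifier(n_estimators=100, random_state=0).fit(
        X[train], y[train]
    )
    for name, X in modalities.items()
}

# Decision-level fusion: stack per-modality predictions and take a majority vote.
votes = np.stack([models[m].predict(modalities[m][test]) for m in modalities])
fused = (votes.mean(axis=0) >= 0.5).astype(int)
accuracy = (fused == y[test]).mean()
```

Because each modality contributes an independent noisy view of the label, the majority vote can correct individual modality errors, which is the intuition behind the reported gain over single-source models.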

Keywords