IEEE Access (Jan 2021)

Recognition of Students’ Mental States in Discussion Based on Multimodal Data and its Application to Educational Support

  • Shimeng Peng,
  • Katashi Nagao

DOI: https://doi.org/10.1109/ACCESS.2021.3054176
Journal volume & issue: Vol. 9, pp. 18235–18250

Abstract

Students experience a complex mixture of mental states during discussion, including concentration, confusion, frustration, and boredom, which are widely acknowledged as crucial indicators of a student’s learning state. In this study, we propose using multimodal data to design an intelligent monitoring agent that can assist teachers in effectively monitoring students’ multiple mental states during discussion. We first developed an advanced multi-sensor-based system and deployed it in a real university research lab to collect a multimodal “in-the-wild” teacher-student conversation dataset. We then derived a set of proxy features from facial, heart rate, and acoustic modalities and used them to train several supervised learning classifiers under different multimodal fusion approaches (single-channel, feature-level, and decision-level fusion) to recognize students’ multiple mental states in conversation. We explored how to design multimodal analytics that augment the ability to recognize different mental states and found that fusing the heart rate and acoustic modalities yields better recognition of concentration (AUC = 0.842) and confusion (AUC = 0.695), while fusing all three modalities yields the best performance in recognizing frustration (AUC = 0.737) and boredom (AUC = 0.810). Our results also point to the possibility of exploiting the interchangeability of modalities to give human teachers practical solutions for monitoring students in different real-world educational environments.
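To make the two main fusion strategies concrete, the sketch below contrasts feature-level fusion (concatenating per-modality features into a single classifier) with decision-level fusion (averaging the probabilities of one classifier per modality) and scores both by AUC. The synthetic data, feature dimensions, and choice of classifiers are illustrative assumptions, not the authors’ actual features or pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-utterance features from two modalities
# (hypothetical dimensions; the paper derives proxy features from
# facial, heart rate, and acoustic channels).
n = 400
X_hr = rng.normal(size=(n, 8))    # heart-rate-derived features
X_ac = rng.normal(size=(n, 12))   # acoustic features
y = rng.integers(0, 2, size=n)    # binary label, e.g. "concentrated" or not

idx_train, idx_test = train_test_split(
    np.arange(n), test_size=0.25, random_state=0
)

# Feature-level fusion: concatenate modality features, train one classifier.
X_feat = np.hstack([X_hr, X_ac])
clf = RandomForestClassifier(random_state=0)
clf.fit(X_feat[idx_train], y[idx_train])
p_feat = clf.predict_proba(X_feat[idx_test])[:, 1]
print("feature-level AUC:", roc_auc_score(y[idx_test], p_feat))

# Decision-level fusion: one classifier per modality, average probabilities.
p_mods = []
for X in (X_hr, X_ac):
    m = LogisticRegression(max_iter=1000)
    m.fit(X[idx_train], y[idx_train])
    p_mods.append(m.predict_proba(X[idx_test])[:, 1])
p_dec = np.mean(p_mods, axis=0)
print("decision-level AUC:", roc_auc_score(y[idx_test], p_dec))
```

On real data, comparing these AUCs per mental state is what motivates the paper’s finding that different states favor different modality combinations.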

Keywords