Computers and Education: Artificial Intelligence (Dec 2024)

Teachers' and students' perceptions of AI-generated concept explanations: Implications for integrating generative AI in computer science education

  • Soohwan Lee,
  • Ki-Sang Song

Journal volume & issue
Vol. 7
p. 100283

Abstract


The educational application of Generative AI (GAI) has garnered significant interest, sparking discussions about the pedagogical value of GAI-generated content. This study investigates the perceived effectiveness of concept explanations produced by GAI compared with those created by human teachers, focusing on the programming concepts of sequence, selection, and iteration. The research also explores teachers' and students' ability to discern the source of these explanations. Participants included 11 teachers and 70 sixth-grade students, who were presented with concept explanations created by teachers or generated by ChatGPT and asked to rate the helpfulness of each explanation and identify its source. Results indicated that teachers found GAI-generated explanations more helpful for the sequence and selection concepts while preferring teacher-created explanations for iteration (χ²(2, N = 11) = 10.062, p = .007, ω = .595). In contrast, students showed varying abilities to distinguish between AI-generated and teacher-created explanations across concepts, with significant differences observed (χ²(2, N = 70) = 22.127, p < .001, ω = .399). Notably, students had difficulty identifying the source of explanations for the iteration concept (χ²(1, N = 70) = 8.45, p = .004, φ = .348). Qualitative analysis of open-ended responses revealed that teachers and students applied similar criteria when evaluating explanations but differed in their ability to discern the source: teachers focused on pedagogical effectiveness, whereas students prioritized relatability and clarity. The findings highlight the importance of considering both teachers' and students' perspectives when integrating GAI into computer science education. The study proposes strategies for designing GAI-based explanations that cater to learners' needs and emphasizes the necessity of explicit AI literacy instruction. Limitations and future research directions are discussed, underlining the need for larger-scale studies and experimental designs that assess the impact of GAI on actual learning outcomes.

Keywords