IEEE Access (Jan 2020)
A Framework for Automated Formative Assessment in Mathematics Courses
Abstract
In general terms, the aim of formative assessment is to gather and analyze information about students' progress, with the intention of improving instruction in real time. It can also help teachers identify the elements that students are struggling with, and therefore determine where adjustments must be made. There are, however, some issues that must be addressed before this approach can become more widespread. In this research, with mathematics as our specific scope, we focus on two of these issues: we present a framework that helps teachers implement assessment items that are highly dynamic and flexible, while at the same time allowing them to model common misconceptions and to provide suitable feedback. Following this, we carried out an experiment to determine (i) whether students perceived such feedback to be useful; (ii) whether they actually used it; and ultimately (iii) whether it made a difference to their performance. To answer these questions, we used an online "pre-calculus" course with 458 students and collected data both from a perception survey and from the students' interactions with the learning management system (LMS). Based on these data, we performed several analyses, including process mining and statistical hypothesis tests. The results show that although many students had a positive opinion of the feedback they received, it was not always followed; nevertheless, when it was (or at least when students said it was), they performed better.
Keywords