Frontiers in Psychology (Apr 2019)

Multiple-Choice Item Distractor Development Using Topic Modeling Approaches

  • Jinnie Shin,
  • Qi Guo,
  • Mark J. Gierl

DOI
https://doi.org/10.3389/fpsyg.2019.00825
Journal volume & issue
Vol. 10

Abstract

Writing a high-quality, multiple-choice test item is a complex process. Creating plausible but incorrect options for each item poses significant challenges for the content specialist because this task is often undertaken without a systematic method. In the current study, we describe and demonstrate a systematic method for creating plausible but incorrect options, also called distractors, based on students’ misconceptions. These misconceptions are extracted from students’ labeled written responses. A total of 1,515 written responses from Grade 10 students to an existing constructed-response item in Biology were used to demonstrate the method. Using latent Dirichlet allocation, a topic modeling procedure commonly used in machine learning and natural language processing, 22 plausible misconceptions were identified in students’ written responses and used to produce a list of plausible distractors. These distractors, in turn, were used as part of new multiple-choice items. Implications for item development are discussed.

Keywords