Journal of CME (Dec 2024)

Comparison of Multiple-Choice Question Formats in a First Year Medical Physiology Course

  • L. Britt Wilson
  • Christine DiStefano
  • Huijuan Wang
  • Erika L. Blanck

DOI
https://doi.org/10.1080/28338073.2024.2390264
Journal volume & issue
Vol. 13, no. 1

Abstract

The purpose of this study was to compare student performance and question discrimination of multiple-choice questions (MCQs) that followed a standard format (SF) versus those that did not, termed here non-standard format (NSF). Medical physiology exam results of approximately 500 first-year medical students collected over a five-year period (2020–2024) were used. Classical test theory item analysis indices, e.g. discrimination (D), point-biserial correlation (rpbis), distractor analysis for non-functional distractors (NFDs), and difficulty (p), were determined and compared across MCQ format types. The results presented here are the mean ± standard error of the mean (SEM). The analysis showed that D (0.278 ± 0.008 vs 0.228 ± 0.006) and rpbis (0.291 ± 0.006 vs 0.273 ± 0.006) were significantly higher for NSF questions than for SF questions, indicating that NSF questions provided more discriminatory power. In addition, the percentage of NFDs was lower for the NSF items than for the SF ones (58.3 ± 0.019% vs 70.2 ± 0.015%). The NSF questions also proved to be more difficult than the SF questions (p = 0.741 ± 0.007 for NSF; p = 0.809 ± 0.006 for SF). Thus, the NSF questions discriminated better, had fewer NFDs, and were more difficult than SF questions. These data suggest that using the selected non-standard item-writing formats can enhance the ability of MCQs to discriminate higher performers from lower performers, as well as provide more rigour for exams.
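The classical test theory indices named in the abstract are straightforward to compute from a scored response matrix. The sketch below is illustrative only, not the authors' analysis pipeline: it assumes items are scored 0/1 and uses a conventional upper/lower 27% split for D and a corrected (rest-of-test) item-total correlation for the point-biserial; distractor/NFD analysis would additionally require the raw option choices, which this sketch does not model.

```python
from statistics import mean, pstdev

def item_analysis(scores, upper_frac=0.27):
    """Classical test theory item indices.

    scores: list of per-student rows, each a list of 0/1 item scores.
    Returns (p, D, r_pbis) per item:
      p      - difficulty (proportion of students answering correctly)
      D      - discrimination (upper-group minus lower-group proportion correct)
      r_pbis - point-biserial correlation of the item with the rest-of-test score
    """
    n, k = len(scores), len(scores[0])
    totals = [sum(row) for row in scores]

    # Difficulty p: proportion correct per item.
    p = [mean(row[j] for row in scores) for j in range(k)]

    # Discrimination D: rank students by total score, compare the top and
    # bottom groups (conventionally the top and bottom 27%).
    g = max(1, round(upper_frac * n))
    order = sorted(range(n), key=lambda i: totals[i])
    lower = [scores[i] for i in order[:g]]
    upper = [scores[i] for i in order[-g:]]
    D = [mean(r[j] for r in upper) - mean(r[j] for r in lower) for j in range(k)]

    # Point-biserial: Pearson correlation of the 0/1 item score with the
    # total score excluding that item (avoids part-whole inflation).
    def corr(x, y):
        mx, my = mean(x), mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
        return cov / (pstdev(x) * pstdev(y))

    r_pbis = []
    for j in range(k):
        item = [row[j] for row in scores]
        rest = [totals[i] - scores[i][j] for i in range(n)]
        r_pbis.append(corr(item, rest))
    return p, D, r_pbis
```

Under this convention, higher D and r_pbis mean an item better separates high scorers from low scorers, and lower p means a harder item, matching the direction of the comparisons reported above.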

Keywords