Indian Journal of Ophthalmology (Jan 2021)
Item analysis and optimizing multiple-choice questions for a viable question bank in ophthalmology: A cross-sectional study
Abstract
Purpose: Multiple-choice questions (MCQs) are useful in assessing student performance, covering a wide range of topics in an objective way. Their reliability and validity depend upon how well they are constructed. Defective items detected by item analysis must be examined for item-writing flaws and optimized. The aim of this study was to evaluate the MCQs for difficulty level and discriminating power with functional distractors by item analysis, analyze poor items for writing flaws, and optimize them. Methods: This was a prospective cross-sectional study involving 120 MBBS students writing a formative assessment in Ophthalmology. It comprised 40 single-response MCQs as part of a 3-h paper for 20 marks. Items were categorized according to their difficulty index, discrimination index, and distractor efficiency using simple proportions, mean, standard deviation, and correlation. The defective items were analyzed for proper construction and optimized. Results: The mean score of the study group was 13.525 ± 2.617. The mean difficulty index, discrimination index, and distractor efficiency were 53.22, 0.26, and 78.32, respectively. Among the 40 MCQs, 25 had no non-functioning distractor; 7 had one, 5 had two, and 3 had three. Of the 20 defective items, 17 were optimized and added to the question bank, two were added without modification, and one was dropped. Conclusion: Item analysis is a valuable tool in detecting poor MCQs, and optimizing them is a critical step. The defective items identified should be optimized rather than dropped, so that the content area covered by each defective item is not left out of the assessment.
Keywords
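The three indices named in the Methods can be sketched in code. This is a minimal illustration, not the authors' analysis script: it assumes the standard definitions of the difficulty index (percentage of examinees answering correctly, here computed from top and bottom scoring groups), the discrimination index (difference in correct responses between the top and bottom groups divided by group size), and distractor efficiency (a distractor chosen by fewer than 5% of examinees is counted as non-functioning). All input values below are hypothetical.

```python
# Item analysis for a single MCQ: difficulty index (DIF I),
# discrimination index (DI), and distractor efficiency (DE).
# Formulas follow common item-analysis practice; the paper may
# have used slightly different group sizes or thresholds.

def item_analysis(high_correct, low_correct, group_size,
                  distractor_counts, total_examinees):
    """high_correct / low_correct: correct responses in the top and
    bottom scoring groups (each of size group_size, e.g. top/bottom 27%).
    distractor_counts: number of examinees choosing each wrong option.
    """
    # Difficulty index: percentage answering correctly (higher = easier item)
    dif_i = (high_correct + low_correct) / (2 * group_size) * 100
    # Discrimination index: how well the item separates high from low scorers
    di = (high_correct - low_correct) / group_size
    # Non-functioning distractor (NFD): chosen by < 5% of all examinees
    nfd = sum(1 for c in distractor_counts if c / total_examinees < 0.05)
    de = (len(distractor_counts) - nfd) / len(distractor_counts) * 100
    return dif_i, di, de

# Hypothetical item from a cohort of 120 students: groups of 32,
# 24 correct in the high group, 12 in the low group, and three
# distractors drawing 20, 10, and 4 responses respectively.
dif_i, di, de = item_analysis(24, 12, 32, [20, 10, 4], 120)
print(f"DIF I = {dif_i:.2f}%, DI = {di:.2f}, DE = {de:.1f}%")
```

For this hypothetical item, the distractor chosen by only 4 of 120 examinees (about 3.3%) falls below the 5% threshold and counts as non-functioning, which is the situation the Results tally when reporting how many MCQs had one, two, or three non-functioning distractors.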