REID (Research and Evaluation in Education) (Jun 2024)

A psychometric evaluation of an item bank for an English reading comprehension tool using Rasch analysis

  • Louis Wai Keung Yim,
  • Che Yee Lye,
  • Poh Wee Koh

DOI
https://doi.org/10.21831/reid.v10i1.65284
Journal volume & issue
Vol. 10, no. 1
pp. 18 – 34

Abstract


This study reports the psychometric evaluation of an item bank for an Assessment for Learning (AfL) tool that assesses primary school students’ reading comprehension skills. A pool of 46 Primary 1 to 6 reading passages and their accompanying 522 multiple-choice and short-answer items was developed based on the Progress in International Reading Literacy Study (PIRLS) assessment framework. The items were field-tested at 27 schools in Singapore, involving 9,834 students aged between 7 and 13. Four main comprehension processes outlined in PIRLS were assessed: focusing on and retrieving explicitly stated information, making straightforward inferences, interpreting and integrating ideas and information, and evaluating and critiquing content and textual elements. Rasch analysis was employed to examine students’ item response patterns for (1) model and item fit; (2) differential item functioning (DIF) with respect to gender and test platform used; (3) local item dependence (LID) within and amongst reading passages; and (4) distractor issues among the options of the multiple-choice items. Results showed that the data adequately fit the unidimensional Rasch model across all test levels, with good internal consistency. Psychometric issues found amongst items were primarily related to ill-functioning distractors and local item dependence. Problematic items were reviewed and subsequently amended by a panel of assessment professionals for future recalibration. This psychometrically and theoretically sound item bank is envisaged to be valuable for developing comprehensive classroom AfL tools that inform English reading comprehension instructional design in the Singaporean context.
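For readers unfamiliar with the dichotomous Rasch model used in the analysis, the sketch below illustrates its core idea: the probability of a correct response depends only on the difference between a person's ability and an item's difficulty. The simulated data, sample sizes, and difficulty values here are hypothetical and are not taken from the study; the simple log-odds estimator shown is an initial (PROX-style) approximation, not the estimation procedure the authors used.

```python
import numpy as np

def rasch_probability(theta, b):
    """P(correct response) under the dichotomous Rasch model:
    sigmoid of (person ability theta - item difficulty b)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

rng = np.random.default_rng(0)
n_persons, n_items = 500, 10

theta = rng.normal(0.0, 1.0, n_persons)   # hypothetical person abilities
b = np.linspace(-2.0, 2.0, n_items)       # hypothetical item difficulties

# Simulate a 0/1 response matrix (persons x items).
prob = rasch_probability(theta[:, None], b[None, :])
responses = (rng.random((n_persons, n_items)) < prob).astype(int)

# Crude initial item-difficulty estimates: log-odds of an incorrect response,
# centred so mean difficulty is zero (the usual Rasch identification constraint).
p_correct = responses.mean(axis=0)
b_hat = np.log((1.0 - p_correct) / p_correct)
b_hat -= b_hat.mean()

print(np.round(b_hat, 2))  # should track the true difficulties in b
```

Harder items (larger b) yield lower proportions correct and hence larger log-odds estimates, so the recovered ordering of items mirrors the true one even with this simple estimator.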

Keywords