Language Testing in Asia (May 2017)

An overview of differential item functioning in multistage computer adaptive testing using three-parameter logistic item response theory

  • Karim Sadeghi,
  • Zainab Abolfazli Khonbi

DOI
https://doi.org/10.1186/s40468-017-0038-z
Journal volume & issue
Vol. 7, no. 1
pp. 1–16

Abstract

As perfectly summarised by Ida Lawrence, “Testing is growing by leaps and bounds across the world. There is a realization that a nation’s well-being depends crucially on the educational achievement of its population. Valid tests are an essential tool to evaluate a nation’s educational standing and to implement efficacious educational reforms. Because tests consume time that otherwise could be devoted to instruction, it is important to devise tests that are efficient. Doing so requires a careful balancing of the contributions of technology, psychometrics, test design, and the learning sciences. Computer adaptive multistage testing (MSCAT) fits the bill extraordinarily well; unlike other forms of adaptive testing, it can be adapted to educational surveys and student testing. Research in this area will be evidence that the methodologies and underlying technology that surround MSCAT have reached maturity and that there is a growing acceptance by the field of this type of test design” (from the Foreword to D. Yan, A. A. von Davier, & C. Lewis (Eds.), Computerized multistage testing: Theory and applications). This state-of-the-art paper presents an overview of differential item functioning (DIF) in MSCAT using three-parameter logistic (3PL) item response theory (IRT), and offers suggestions for implementing it in practice, in the hope of motivating testing and assessment researchers and practitioners to initiate projects in this under-practiced area by helping them better understand the relevant technical concepts.
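The 3PL IRT model and the notion of DIF mentioned in the abstract can be sketched briefly. Under the 3PL model, the probability of a correct response is P(θ) = c + (1 − c) / (1 + exp(−a(θ − b))), and uniform DIF arises when the same item yields different success probabilities for examinees of matched ability in a reference versus a focal group. The sketch below uses only the standard library; all parameter values are hypothetical, chosen purely for illustration:

```python
import math

def p_3pl(theta, a, b, c):
    """Probability of a correct response under the 3PL IRT model:
    P(theta) = c + (1 - c) / (1 + exp(-a * (theta - b)))
    a: discrimination, b: difficulty, c: pseudo-guessing."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical calibrations of one item in two groups. The focal group's
# higher b means the item is harder for them at every ability level.
ref = dict(a=1.2, b=0.0, c=0.2)   # reference-group parameters
foc = dict(a=1.2, b=0.5, c=0.2)   # focal-group parameters

# At matched ability theta, the difference in success probabilities is
# the signature of (uniform) DIF.
for theta in (-1.0, 0.0, 1.0):
    gap = p_3pl(theta, **ref) - p_3pl(theta, **foc)
    print(f"theta={theta:+.1f}  DIF gap={gap:.3f}")
```

Operational DIF analyses compare group-specific item characteristic curves (or parameter estimates) with formal statistical tests rather than eyeballing gaps, but the matched-ability comparison above is the underlying idea.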

Keywords