Jurnal Geografi (Jul 2024)

Analysis of Multiple Choice HOTS Test Questions on the Final Semester Assessment

  • Budi Rahmah Panjaitan,
  • Epon Ningrum,
  • Bagja Waluya

DOI
https://doi.org/10.24114/jg.v16i2.55504
Journal volume & issue
Vol. 16, no. 2
pp. 170 – 180

Abstract

Assessment is essential in the education system for monitoring educational development and making appropriate instructional decisions. This study aims to analyze the quality of multiple-choice questions based on higher-order thinking skills (HOTS) in the Geography subject using Anates Windows Version 4.0.9 software. This evaluative research analyzes the final semester exam questions for class X Geography at SMA Islam Al Azhar 14 Semarang. The data, consisting of the questions, answer keys, and exam results, were collected from 32 students. The analysis was carried out descriptively and quantitatively to assess the difficulty level, discriminating power, reliability, distractor effectiveness, and validity of the questions. The results show that the difficulty level of most of the questions is in the medium category (50%). Discriminating power is mostly good, although several items show poor discrimination. Most of the distractors worked well, but some questions had ineffective distractors. Regarding validity, 17 questions were valid, while 11 were invalid. Test reliability is in the high category, with a reliability coefficient of 0.71. Anates Windows Version 4.0.9 effectively analyzed the quality of the HOTS questions. These findings indicate that although most of the questions are of good quality, some require revision to improve their validity and effectiveness. This research emphasizes the importance of using technology in educational assessment to ensure the quality and fairness of exam questions, and it contributes to the development of assessment tools that improve students' higher-order thinking skills.

Keywords: Item Analysis; Multiple Choice; HOTS; SMA Islam Al-Azhar
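The item statistics named in the abstract (difficulty level, discriminating power, item validity, and a reliability coefficient of 0.71) correspond to standard classical test theory measures. The sketch below is a minimal illustration of how such statistics could be computed from dichotomous (0/1) item scores; it is not the Anates Windows software used in the study, and the array shape, the 27% upper/lower group cut-off, the point-biserial validity measure, the KR-20 formula, and the simulated data are assumptions made for illustration only.

```python
# Illustrative classical test theory item analysis (assumed sketch, not Anates).
# `responses` is assumed to be a 0/1 score matrix of shape (students, items).
import numpy as np

def item_analysis(responses: np.ndarray) -> dict:
    n_students, n_items = responses.shape
    totals = responses.sum(axis=1)

    # Difficulty index: proportion of students answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Discriminating power: difference in correctness between the upper and
    # lower 27% groups ranked by total score (a common heuristic cut-off).
    k = max(1, int(round(0.27 * n_students)))
    order = np.argsort(totals)
    lower, upper = responses[order[:k]], responses[order[-k:]]
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)

    # Item validity indicator: point-biserial correlation of each item
    # with the total score.
    validity = np.array([
        np.corrcoef(responses[:, i], totals)[0, 1] for i in range(n_items)
    ])

    # KR-20 reliability coefficient for dichotomously scored items.
    p, q = difficulty, 1.0 - difficulty
    var_total = totals.var(ddof=1)
    kr20 = (n_items / (n_items - 1)) * (1.0 - (p * q).sum() / var_total)

    return {
        "difficulty": difficulty,
        "discrimination": discrimination,
        "validity": validity,
        "kr20": kr20,
    }

# Example with simulated data: 32 students and 28 items, matching the counts
# reported in the study (17 valid + 11 invalid items); the scores are random.
rng = np.random.default_rng(0)
scores = (rng.random((32, 28)) < 0.6).astype(int)
stats = item_analysis(scores)
print(f"KR-20 reliability: {stats['kr20']:.2f}")
```

In this sketch, items with difficulty between roughly 0.3 and 0.7 would fall in the "medium" category, and items with low or negative discrimination or point-biserial values would be candidates for the revision the study recommends.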