IEEE Access (Jan 2024)

Descriptive Answers Evaluation Using Natural Language Processing Approaches

  • Lalitha Manasa Chandrapati,
  • Ch. Koteswara Rao

DOI
https://doi.org/10.1109/ACCESS.2024.3417706
Journal volume & issue
Vol. 12
pp. 87333 – 87347

Abstract

Answer scripts are an important aspect of evaluating students' performance, and evaluating papers from a descriptive outlook can be a challenging and exhausting task. Typically, answer-script evaluations are conducted manually, which can introduce bias and be quite time-consuming. Various efforts have been made to automate the evaluation of student responses using Artificial Intelligence techniques. Yet most of this work relies on particular words or simple counts to accomplish the task, and there is also a shortage of organized datasets. In this research, a novel ensemble model, the Descriptive Answer Evaluation System (DAES), is introduced, which integrates Topic Modelling (TM) and Question Answering (QA) models to automatically evaluate descriptive answers. Latent Dirichlet Allocation (LDA) and a fine-tuned Text-to-Text Transfer Transformer (T5) model were utilized to identify key topics and assess the correctness of specific statements within the student answers. Sentence-BERT is used to encode sentences, and the cosine similarity method is applied to generate similarity scores. In this approach, LDA handles thematic evaluation, while T5 performs semantic analysis of the student answer. A final score is assigned to each answer after a thorough review procedure using predetermined criteria. Experiments with the proposed model achieve an accuracy of 95%, precision of 94%, recall of 95%, and F1-score of 94% on the training data.
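The similarity-scoring step described above (encode both answers, then compare with cosine similarity) can be sketched as follows. This is a minimal stand-in: it uses bag-of-words term-frequency vectors in place of the paper's Sentence-BERT embeddings, and the function and variable names are illustrative, not taken from the paper.

```python
import math
from collections import Counter

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    common = set(a) & set(b)
    dot = sum(a[t] * b[t] for t in common)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def similarity_score(reference_answer: str, student_answer: str) -> float:
    """Score a student answer against a reference answer in [0, 1].

    In the paper's pipeline the encodings would come from Sentence-BERT;
    here plain term-frequency vectors stand in for those embeddings.
    """
    ref_vec = Counter(reference_answer.lower().split())
    ans_vec = Counter(student_answer.lower().split())
    return cosine_similarity(ref_vec, ans_vec)
```

In the full system this score would be one input, alongside the LDA topic coverage and T5 correctness checks, to the final predetermined grading criteria.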

Keywords