IEEE Access (Jan 2024)

Distractor Generation Through Text-to-Text Transformer Models

  • David De-Fitero-Dominguez,
  • Eva Garcia-Lopez,
  • Antonio Garcia-Cabot,
  • Jesus-Angel Del-Hoyo-Gabaldon,
  • Antonio Moreno-Cediel

DOI: https://doi.org/10.1109/ACCESS.2024.3361673
Journal volume & issue: Vol. 12, pp. 25580–25589

Abstract

In recent years, transformer language models have made a significant impact on automatic text generation. This study focuses on the task of distractor generation in Spanish using a fine-tuned multilingual text-to-text model, namely mT5. Our method outperformed established baselines based on LSTM networks, confirming the effectiveness of Transformer architectures in such NLP tasks. While comparisons with other Transformer-based solutions yielded mixed outcomes depending on the metric of choice, our method notably achieved superior results on the ROUGE metric compared to the GPT-2 approach. Although traditional evaluation metrics such as BLEU and ROUGE are commonly used, this paper argues for more context-sensitive metrics given the inherent variability in acceptable distractor generation results. Among the contributions of this research are a comprehensive comparison with other methods, an examination of the potential drawbacks of multilingual models, and the introduction of alternative evaluation metrics. Future research directions, derived from our findings and a review of related works, are also suggested, with a particular emphasis on leveraging other language models and Transformer architectures.
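To illustrate the kind of text-to-text setup the abstract describes, the sketch below shows how a fine-tuned mT5 checkpoint could be queried for Spanish distractor generation with Hugging Face Transformers. It is not the authors' released code: the checkpoint name, the "contexto / pregunta / respuesta" input format, and the decoding settings are assumptions for illustration only.

```python
# Minimal sketch (assumed setup, not the paper's implementation) of distractor
# generation with a text-to-text mT5 model via Hugging Face Transformers.
from transformers import MT5ForConditionalGeneration, MT5Tokenizer

# Placeholder base checkpoint; in practice a model fine-tuned on a Spanish
# distractor-generation dataset would be loaded here.
model_name = "google/mt5-base"
tokenizer = MT5Tokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# Hypothetical input format: the model is conditioned on the reading context,
# the question, and the correct answer, and is trained to output a plausible
# but incorrect option (the distractor).
source = (
    "contexto: El Ebro es uno de los ríos más caudalosos de España. "
    "pregunta: ¿Cuál es uno de los ríos más caudalosos de España? "
    "respuesta: El Ebro"
)

inputs = tokenizer(source, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=16, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With a base (non-fine-tuned) checkpoint the output is not meaningful; the sketch only shows the sequence-to-sequence interface, where candidate distractors would then be scored against references with metrics such as BLEU or ROUGE.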

Keywords