Sensors & Transducers (Feb 2021)

Experimental Comparison of Transformers and Reformers for Text Classification

  • Roghayeh Soleymani,
  • Julien Beaulieu,
  • Jérémie Farret

Journal volume & issue
Vol. 249, no. 2
pp. 110–118

Abstract

In this paper, we present an experimental analysis of Transformers and Reformers for text classification applications in natural language processing. Transformers and Reformers yield state-of-the-art performance and use attention scores to capture the relationships between words in a sentence; these scores can be computed in parallel on GPU clusters. Reformers improve on Transformers by lowering time and memory complexity. We present our evaluation and analysis of the applicable architectures and the performance improvements they offer. The experiments in this paper are run in Trax on Mind in a Box with four different datasets and under different hyperparameter settings. We observe that Transformers achieve better performance than Reformers in terms of accuracy and training speed for text classification. However, Reformers allow the training of larger models that would otherwise cause memory failures with Transformers.
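For context on the mechanism the abstract refers to, the sketch below (ours, not taken from the paper) implements scaled dot-product self-attention in plain NumPy. The n × n score matrix it builds is exactly what makes full Transformer attention quadratic in sequence length, and it is the cost that Reformers reduce, for example with locality-sensitive-hashing attention. All names and values here are illustrative.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        """Full (Transformer-style) attention: O(n^2) in sequence length n."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)     # (n, n) pairwise word-to-word scores
        # Numerically stable softmax over each row of the score matrix.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V                   # (n, d_v) context vectors

    # Toy example: a "sentence" of 4 tokens with 8-dimensional embeddings.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(4, 8))
    out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V
    print(out.shape)  # (4, 8)

Because every row of the score matrix is independent, the computation parallelizes well on GPUs, which is the property the abstract highlights; the trade-off is that memory grows quadratically with sentence length, motivating the Reformer variants compared in the paper.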

Keywords