Sensors (Mar 2023)

Efficient Memory-Enhanced Transformer for Long-Document Summarization in Low-Resource Regimes

  • Gianluca Moro,
  • Luca Ragazzi,
  • Lorenzo Valgimigli,
  • Giacomo Frisoni,
  • Claudio Sartori,
  • Gustavo Marfia

DOI: https://doi.org/10.3390/s23073542
Journal volume & issue: Vol. 23, no. 7

Abstract

Long-document summarization poses obstacles to current generative transformer-based models because of the broad context they must process and understand. Indeed, detecting long-range dependencies is still challenging for today’s state-of-the-art solutions, which usually require model expansion at the cost of an unsustainable demand for computing and memory capacities. This paper introduces Emma, a novel efficient memory-enhanced transformer-based architecture. By segmenting a lengthy input into multiple text fragments, our model stores each chunk and compares the current one with those seen previously, gaining the capability to read and comprehend the context of the whole document with a fixed amount of GPU memory. This method enables the model to deal with theoretically infinitely long documents, using less than 18 GB of memory for training and 13 GB for inference. We conducted extensive performance analyses and demonstrated that Emma achieves competitive results on two datasets from different domains while consuming significantly less GPU memory than competitors, even in low-resource settings.
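To illustrate the chunk-wise, memory-based processing the abstract describes, below is a minimal PyTorch sketch of the general idea: the document is segmented into fragments, each fragment is encoded, and the current chunk attends over a bounded memory of previous chunks so GPU usage stays fixed. The module names, dimensions, and memory-update rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of chunk-wise encoding with a fixed-size memory of
# previous chunks; names and the memory-update rule are illustrative only.
import torch
import torch.nn as nn


class ChunkMemoryEncoder(nn.Module):
    def __init__(self, d_model=512, n_heads=8, mem_slots=128):
        super().__init__()
        self.encoder = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        # Cross-attention lets the current chunk read the stored memory.
        self.mem_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mem_slots = mem_slots

    def forward(self, chunks):
        # chunks: list of (batch, chunk_len, d_model) tensors from one long document
        memory = None
        outputs = []
        for chunk in chunks:
            h = self.encoder(chunk)  # encode the current fragment
            if memory is not None:
                # Compare the current chunk with previously stored ones.
                ctx, _ = self.mem_attn(h, memory, memory)
                h = h + ctx
            outputs.append(h)
            # Append to memory, keeping at most `mem_slots` vectors so the
            # memory footprint stays fixed regardless of document length.
            memory = h if memory is None else torch.cat([memory, h], dim=1)
            memory = memory[:, -self.mem_slots:].detach()
        return torch.cat(outputs, dim=1)


if __name__ == "__main__":
    model = ChunkMemoryEncoder()
    doc = torch.randn(1, 2048, 512)       # embeddings of a "long" document
    chunks = list(doc.split(512, dim=1))   # segment into text fragments
    print(model(chunks).shape)             # torch.Size([1, 2048, 512])
```

Because only one chunk plus a bounded memory is resident at a time, the per-step cost does not grow with document length, which is the property the abstract attributes to Emma.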

Keywords