Transactions of the Association for Computational Linguistics (Jan 2021)

Extractive Opinion Summarization in Quantized Transformer Spaces

  • Stefanos Angelidis,
  • Reinald Kim Amplayo,
  • Yoshihiko Suhara,
  • Xiaolan Wang,
  • Mirella Lapata

DOI
https://doi.org/10.1162/tacl_a_00366
Journal volume & issue
Vol. 9
pp. 277–293

Abstract

We present the Quantized Transformer (QT), an unsupervised system for extractive opinion summarization. QT is inspired by Vector-Quantized Variational Autoencoders, which we repurpose for popularity-driven summarization. It uses a clustering interpretation of the quantized space and a novel extraction algorithm to discover popular opinions among hundreds of reviews, a significant step towards opinion summarization of practical scope. In addition, QT enables controllable summarization without further training, by utilizing properties of the quantized space to extract aspect-specific summaries. We also make publicly available Space, a large-scale evaluation benchmark for opinion summarizers, comprising general and aspect-specific summaries for 50 hotels. Experiments demonstrate the promise of our approach, which is validated by human studies where judges showed clear preference for our method over competitive baselines.
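To make the abstract's "clustering interpretation of the quantized space" concrete, the sketch below shows the core vector-quantization step that VQ-VAE-style models rely on: a continuous encoding is snapped to its nearest vector ("code") in a learned codebook, so inputs mapped to the same code form a cluster. The codebook size, dimensions, and inputs here are toy assumptions for illustration, not the paper's actual architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical toy codebook: 8 codes in a 4-dimensional latent space.
codebook = rng.normal(size=(8, 4))

def quantize(x, codebook):
    """Return the index and vector of the nearest codebook entry."""
    dists = np.linalg.norm(codebook - x, axis=1)
    k = int(np.argmin(dists))
    return k, codebook[k]

# A toy stand-in for a continuous sentence encoding.
x = rng.normal(size=4)
k, q = quantize(x, codebook)
# All sentences assigned to code k form one cluster; an extraction
# scheme in this spirit can rank clusters by how many reviews map to
# them, surfacing popular opinions.
```

In this reading, aspect-specific summarization amounts to restricting extraction to the subset of codes associated with a given aspect, which is why no further training is needed.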