IEEE Access (Jan 2019)

Language Model-Driven Topic Clustering and Summarization for News Articles

  • Peng Yang,
  • Wenhan Li,
  • Guangzhen Zhao

DOI
https://doi.org/10.1109/ACCESS.2019.2960538
Journal volume & issue
Vol. 7
pp. 185506 – 185519

Abstract

Topic models have been widely used in Topic Detection and Tracking (TDT) tasks, which aim to detect, track, and describe topics in a stream of broadcast news reports. However, most existing topic models neglect semantic and syntactic information and lack readable topic descriptions. To exploit semantic and syntactic information, Language Models (LMs) have been applied to many supervised NLP tasks, but they have not yet been extended to unsupervised topic clustering. Moreover, it is difficult to employ general-purpose LMs (e.g., BERT) to produce readable topic summaries because of the mismatch between their pretraining objective and the summarization task. In this paper, noting the similarity between an article's content and its summary, we first propose a Language Model-based Topic Model (LMTM) for topic clustering, in which an LM generates deep contextualized word representations. We then introduce a new method for training a Topic Summarization Model, which not only produces brief topic summaries but can also serve as the LM in LMTM for topic clustering. Empirical evaluations on two different datasets show that the proposed LMTM outperforms four baselines on JC, FMI, precision, recall, and F1-score. In addition, the readable and reasonable summaries it generates further validate the rationality of our model components.
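The clustering stage described above — grouping documents by their LM-derived contextualized representations — can be illustrated with a minimal sketch. This is not the authors' LMTM algorithm; it assumes toy 2-D vectors standing in for LM embeddings and uses plain k-means as a stand-in clustering step:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: cluster the rows of X into k groups."""
    rng = np.random.default_rng(seed)
    # initialize centers from k distinct data points
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each vector to its nearest center (squared Euclidean distance)
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # move each center to the mean of its assigned vectors
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)
    return labels

# Toy "embeddings": two well-separated groups standing in for
# contextualized representations of news articles on two topics.
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
labels = kmeans(X, k=2)
print(labels)
```

In the paper's setting, the rows of `X` would instead be deep contextualized representations produced by the LM, and the cluster assignments would correspond to detected topics.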

Keywords