IEEE Access (Jan 2024)

Exploring Topic Coherence With PCC-LDA and BERT for Contextual Word Generation

  • Sandeep Kumar Rachamadugu
  • T. P. Pushphavathi
  • Surbhi Bhatia Khan
  • Mohammad Alojail

DOI
https://doi.org/10.1109/ACCESS.2024.3477992
Journal volume & issue
Vol. 12
pp. 175252–175267

Abstract

In the field of natural language processing (NLP), topic modeling and word generation are crucial for understanding and producing human-like text. Key-phrase extraction is an essential task that supports document summarization, information retrieval, and topic classification, and topic modeling significantly enhances our understanding of the latent structure of textual data. Latent Dirichlet Allocation (LDA) is a popular topic-modeling algorithm that assumes every document is a mixture of several topics and every topic is a distribution over words. Probabilistic Correlated Clustering Latent Dirichlet Allocation (PCC-LDA), a recently introduced variant, improves on LDA. BERT, in turn, is an advanced bidirectional pre-trained language model that interprets each word from its full sentence context and can therefore generate more precise and contextually correct words. Combining PCC-LDA with BERT aims to discover hidden themes within a collection of documents, tune better topics from the corpus, and enhance the topic-modeling implementation; the experiments indicated a significant improvement in performance with this combined approach. Coherence criteria are utilized to judge whether the words in each topic accord with prior knowledge, ensuring that topics are interpretable and meaningful. The topic-level analysis shows that PCC-LDA coherence scores outperform LDA and NMF (non-negative matrix factorization) by at least 15.4% and 12.9% ($k = 5$) and by up to nearly 12.5% and 11.8% ($k = 10$), respectively, where k is the number of topics.
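
To make the pipeline the abstract describes concrete, the sketch below trains a standard LDA model and scores its topics with a coherence metric using gensim. This is a minimal illustration under stated assumptions: PCC-LDA is the authors' own model and is not assumed to be available in any public library, and the toy corpus is hypothetical.

    # Minimal sketch: standard LDA training plus coherence scoring (gensim).
    # PCC-LDA itself is not modeled here; this shows the baseline pipeline.
    from gensim.corpora import Dictionary
    from gensim.models import LdaModel
    from gensim.models.coherencemodel import CoherenceModel

    # Hypothetical pre-tokenized documents standing in for a real corpus.
    docs = [
        ["topic", "model", "discovers", "latent", "themes", "in", "documents"],
        ["lda", "assumes", "each", "document", "mixes", "several", "topics"],
        ["each", "topic", "is", "a", "distribution", "over", "words"],
        ["coherence", "scores", "judge", "whether", "topic", "words", "fit"],
    ]

    dictionary = Dictionary(docs)
    corpus = [dictionary.doc2bow(d) for d in docs]

    # k = 5 topics, matching one of the settings reported in the abstract.
    lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=5,
                   passes=10, random_state=0)

    # C_v coherence: higher values indicate more interpretable topics.
    cm = CoherenceModel(model=lda, texts=docs, dictionary=dictionary,
                        coherence="c_v")
    print("c_v coherence:", cm.get_coherence())

Likewise, BERT's context-aware word prediction can be illustrated with the Hugging Face fill-mask pipeline; the example sentence is ours, not from the paper.

    # Minimal sketch: BERT predicts a masked word from full sentence context.
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="bert-base-uncased")
    for pred in fill_mask("Topic modeling uncovers [MASK] themes in a corpus."):
        print(pred["token_str"], round(pred["score"], 3))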

Keywords