TopicTag: Automatic Annotation of NMF Topic Models Using Chain of Thought and Prompt Tuning with LLMs

Abstract

Topic modeling is a technique for organizing and extracting themes from large collections of unstructured text. Non-negative matrix factorization (NMF) is a common unsupervised approach that decomposes a term frequency-inverse document frequency (TF-IDF) matrix to uncover latent topics and segment the dataset accordingly. While useful for highlighting patterns and clustering documents, NMF does not provide explicit topic labels, necessitating subject matter experts (SMEs) to assign labels manually. We present a methodology for automating topic labeling in documents clustered via NMF with automatic model determination (NMFk). By leveraging the output of NMFk and employing prompt engineering, we utilize large language models (LLMs) to generate accurate topic labels. Our case study on over 34,000 scientific abstracts on Knowledge Graphs demonstrates the effectiveness of our method in enhancing knowledge management and document organization.

Publication
In ACM Symposium on Document Engineering 2024 (DocEng ’24), 2024

Keywords:

nmf, topic labeling, llm, chain of thought, prompt tuning

Citation:

Wanna, S., Solovyev, N., Barron, R., Eren, M.E., Bhattarai, M., Rasmussen, K., Nicholas, C., and Alexandrov, B. TopicTag: Automatic Annotation of NMF Topic Models Using Chain of Thought and Prompt Tuning with LLMs. In DocEng ’24: 24th ACM Symposium on Document Engineering, Aug. 20-23, 2024, Adobe, San Jose, CA. 4 pages.

BibTeX:

@article{wanna2024topictag,
  title={TopicTag: Automatic Annotation of NMF Topic Models Using Chain of Thought and Prompt Tuning with LLMs},
  author={Wanna, Selma and Barron, Ryan and Solovyev, Nick and Eren, Maksim E and Bhattarai, Manish and Rasmussen, Kim and Alexandrov, Boian S},
  journal={arXiv preprint arXiv:2407.19616},
  year={2024}
}
Maksim E. Eren
Scientist

My research interests lie at the intersection of machine learning and cybersecurity, with a concentration in tensor decomposition.