Effective Neural Topic Modeling with Embedding Clustering Regularization

Xiaobao Wu, Xinshuai Dong, Thong Thanh Nguyen, Anh Tuan Luu
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:37335-37357, 2023.

Abstract

Topic models have been prevalent for decades with various applications. However, existing topic models commonly suffer from the notorious topic collapsing: discovered topics semantically collapse towards each other, leading to highly repetitive topics, insufficient topic discovery, and damaged model interpretability. In this paper, we propose a new neural topic model, Embedding Clustering Regularization Topic Model (ECRTM). Besides the existing reconstruction error, we propose a novel Embedding Clustering Regularization (ECR), which forces each topic embedding to be the center of a separately aggregated word embedding cluster in the semantic space. This enables each produced topic to contain distinct word semantics, which alleviates topic collapsing. Regularized by ECR, our ECRTM generates diverse and coherent topics together with high-quality topic distributions of documents. Extensive experiments on benchmark datasets demonstrate that ECRTM effectively addresses the topic collapsing issue and consistently surpasses state-of-the-art baselines in terms of topic quality, topic distributions of documents, and downstream classification tasks.
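The abstract describes ECR only at a high level: it forces each topic embedding to act as the center of a separately aggregated word-embedding cluster. The following is a minimal illustrative sketch of a clustering-style regularizer in that spirit, not the paper's exact formulation (the function name, soft-assignment scheme, and temperature `tau` are all assumptions for illustration):

```python
import numpy as np

def clustering_regularizer(word_emb, topic_emb, tau=0.1):
    """Illustrative clustering-style regularizer (an assumption, not the
    paper's exact ECR): each word embedding is softly assigned to its
    nearest topic embedding, and the penalty is the mean assigned squared
    distance. Minimizing it encourages each topic embedding to sit at the
    center of a distinct word-embedding cluster.

    word_emb:  (V, D) word embeddings
    topic_emb: (K, D) topic embeddings
    """
    # pairwise squared Euclidean distances between words and topics: (V, K)
    dists = ((word_emb[:, None, :] - topic_emb[None, :, :]) ** 2).sum(-1)
    # numerically stable low-temperature soft assignment of words to topics
    shifted = dists - dists.min(axis=1, keepdims=True)
    weights = np.exp(-shifted / tau)
    weights /= weights.sum(axis=1, keepdims=True)
    # expected distance of each word to its (softly) assigned topic center
    return float((weights * dists).sum(axis=1).mean())
```

In a neural topic model this term would be added to the reconstruction loss, so gradients pull topic embeddings apart toward separate word clusters rather than letting them collapse onto one another.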

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-wu23c,
  title     = {Effective Neural Topic Modeling with Embedding Clustering Regularization},
  author    = {Wu, Xiaobao and Dong, Xinshuai and Nguyen, Thong Thanh and Luu, Anh Tuan},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {37335--37357},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/wu23c/wu23c.pdf},
  url       = {https://proceedings.mlr.press/v202/wu23c.html},
  abstract  = {Topic models have been prevalent for decades with various applications. However, existing topic models commonly suffer from the notorious topic collapsing: discovered topics semantically collapse towards each other, leading to highly repetitive topics, insufficient topic discovery, and damaged model interpretability. In this paper, we propose a new neural topic model, Embedding Clustering Regularization Topic Model (ECRTM). Besides the existing reconstruction error, we propose a novel Embedding Clustering Regularization (ECR), which forces each topic embedding to be the center of a separately aggregated word embedding cluster in the semantic space. This enables each produced topic to contain distinct word semantics, which alleviates topic collapsing. Regularized by ECR, our ECRTM generates diverse and coherent topics together with high-quality topic distributions of documents. Extensive experiments on benchmark datasets demonstrate that ECRTM effectively addresses the topic collapsing issue and consistently surpasses state-of-the-art baselines in terms of topic quality, topic distributions of documents, and downstream classification tasks.}
}
Endnote
%0 Conference Paper
%T Effective Neural Topic Modeling with Embedding Clustering Regularization
%A Xiaobao Wu
%A Xinshuai Dong
%A Thong Thanh Nguyen
%A Anh Tuan Luu
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-wu23c
%I PMLR
%P 37335--37357
%U https://proceedings.mlr.press/v202/wu23c.html
%V 202
%X Topic models have been prevalent for decades with various applications. However, existing topic models commonly suffer from the notorious topic collapsing: discovered topics semantically collapse towards each other, leading to highly repetitive topics, insufficient topic discovery, and damaged model interpretability. In this paper, we propose a new neural topic model, Embedding Clustering Regularization Topic Model (ECRTM). Besides the existing reconstruction error, we propose a novel Embedding Clustering Regularization (ECR), which forces each topic embedding to be the center of a separately aggregated word embedding cluster in the semantic space. This enables each produced topic to contain distinct word semantics, which alleviates topic collapsing. Regularized by ECR, our ECRTM generates diverse and coherent topics together with high-quality topic distributions of documents. Extensive experiments on benchmark datasets demonstrate that ECRTM effectively addresses the topic collapsing issue and consistently surpasses state-of-the-art baselines in terms of topic quality, topic distributions of documents, and downstream classification tasks.
APA
Wu, X., Dong, X., Nguyen, T.T. & Luu, A.T. (2023). Effective Neural Topic Modeling with Embedding Clustering Regularization. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:37335-37357. Available from https://proceedings.mlr.press/v202/wu23c.html.