Kernel Topic Models

Philipp Hennig, David Stern, Ralf Herbrich, Thore Graepel
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:511-519, 2012.

Abstract

Latent Dirichlet Allocation models discrete data as a mixture of discrete distributions, using Dirichlet beliefs over the mixture weights. We study a variation of this concept, in which the documents’ mixture weight beliefs are replaced with squashed Gaussian distributions. This allows documents to be associated with elements of a Hilbert space, admitting kernel topic models (KTM), modelling temporal, spatial, hierarchical, social and other structure between documents. The main challenge is efficient approximate inference on the latent Gaussian. We present an approximate algorithm cast around a Laplace approximation in a transformed basis. The KTM can also be interpreted as a type of Gaussian process latent variable model, or as a topic model conditional on document features, uncovering links between earlier work in these areas.
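To make the abstract's generative view concrete, below is a minimal NumPy sketch of a kernel topic model prior: each topic gets a latent Gaussian function over document features (here, a single 1-D feature such as a time stamp), correlated across documents by a kernel, and the Gaussian values are "squashed" into mixture weights. The softmax link, the RBF kernel, and all names and sizes are illustrative assumptions, not the paper's exact specification, and the inference step (the Laplace approximation in a transformed basis) is not shown.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: D documents with 1-D features, K topics,
# V vocabulary words, N_words words per document.
D, K, V, N_words = 8, 3, 20, 50
x = np.linspace(0.0, 1.0, D)  # document features (e.g. publication time)

# Squared-exponential kernel over document features: nearby documents
# receive correlated topic proportions (assumed kernel choice).
def rbf_kernel(a, b, lengthscale=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale ** 2)

K_xx = rbf_kernel(x, x) + 1e-6 * np.eye(D)  # jitter for numerical stability
L = np.linalg.cholesky(K_xx)

# One latent Gaussian function per topic, drawn jointly over all documents.
eta = L @ rng.standard_normal((D, K))  # shape (D, K)

# Squash the Gaussian values into mixture weights with a softmax,
# in place of LDA's Dirichlet draw (the paper's exact link may differ).
theta = np.exp(eta) / np.exp(eta).sum(axis=1, keepdims=True)

# Standard LDA-style emission: topic-word distributions, then word draws.
phi = rng.dirichlet(np.ones(V), size=K)  # shape (K, V)
docs = [rng.choice(V, size=N_words, p=theta[d] @ phi) for d in range(D)]
print(theta.round(2))  # smoothly varying topic proportions across documents

Because theta is a deterministic transform of a jointly Gaussian draw, documents close in feature space share similar topic proportions, which is what lets the model capture the temporal, spatial, or social structure the abstract mentions.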

Cite this Paper


BibTeX
@InProceedings{pmlr-v22-hennig12,
  title     = {Kernel Topic Models},
  author    = {Philipp Hennig and David Stern and Ralf Herbrich and Thore Graepel},
  booktitle = {Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {511--519},
  year      = {2012},
  editor    = {Neil D. Lawrence and Mark Girolami},
  volume    = {22},
  series    = {Proceedings of Machine Learning Research},
  address   = {La Palma, Canary Islands},
  month     = {21--23 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v22/hennig12/hennig12.pdf},
  url       = {http://proceedings.mlr.press/v22/hennig12.html}
}
APA
Hennig, P., Stern, D., Herbrich, R. & Graepel, T. (2012). Kernel Topic Models. Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:511-519.

Related Material

Download PDF: http://proceedings.mlr.press/v22/hennig12/hennig12.pdf