Deep Exponential Families

Rajesh Ranganath, Linpeng Tang, Laurent Charlin, David Blei
Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, PMLR 38:762-771, 2015.

Abstract

We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent “black box” variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show that going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets, and give better predictive performance than state-of-the-art models.
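The layered construction the abstract describes can be sketched as a forward sample from a two-layer DEF with gamma latent layers and Poisson observations (one of the variants studied in the paper). The layer sizes and hyperparameters below are illustrative choices, not values from the paper, and the weights are fixed here rather than given priors as in the full model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes: top latent layer, bottom latent layer, observed dimension.
K2, K1, V = 5, 10, 20

# Positive weights linking adjacent layers; the paper places priors on these,
# but for a simple forward sample we draw them once and hold them fixed.
W1 = rng.gamma(1.0, 1.0, size=(K2, K1))
W0 = rng.gamma(1.0, 1.0, size=(K1, V))

# Top-layer latents from a gamma prior.
z2 = rng.gamma(shape=0.3, scale=1.0, size=K2)

# Each lower layer is drawn from an exponential family (here gamma) whose
# mean is governed by the inner product with the layer above.
z1 = rng.gamma(shape=0.3, scale=(z2 @ W1) / 0.3)

# Observations: Poisson counts whose rate comes from the bottom latent layer.
x = rng.poisson(z1 @ W0)
```

Stacking more layers just repeats the middle step, which is the hierarchy of dependencies the abstract refers to; inference over the latents is then done with black box variational methods rather than model-specific derivations.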

Cite this Paper


BibTeX
@InProceedings{pmlr-v38-ranganath15,
  title = {{Deep Exponential Families}},
  author = {Ranganath, Rajesh and Tang, Linpeng and Charlin, Laurent and Blei, David},
  booktitle = {Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics},
  pages = {762--771},
  year = {2015},
  editor = {Lebanon, Guy and Vishwanathan, S. V. N.},
  volume = {38},
  series = {Proceedings of Machine Learning Research},
  address = {San Diego, California, USA},
  month = {09--12 May},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v38/ranganath15.pdf},
  url = {https://proceedings.mlr.press/v38/ranganath15.html},
  abstract = {We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent "black box" variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show that going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets, and give better predictive performance than state-of-the-art models.}
}
Endnote
%0 Conference Paper
%T Deep Exponential Families
%A Rajesh Ranganath
%A Linpeng Tang
%A Laurent Charlin
%A David Blei
%B Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2015
%E Guy Lebanon
%E S. V. N. Vishwanathan
%F pmlr-v38-ranganath15
%I PMLR
%P 762--771
%U https://proceedings.mlr.press/v38/ranganath15.html
%V 38
%X We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent "black box" variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show that going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets, and give better predictive performance than state-of-the-art models.
RIS
TY - CPAPER
TI - Deep Exponential Families
AU - Rajesh Ranganath
AU - Linpeng Tang
AU - Laurent Charlin
AU - David Blei
BT - Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics
DA - 2015/02/21
ED - Guy Lebanon
ED - S. V. N. Vishwanathan
ID - pmlr-v38-ranganath15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 38
SP - 762
EP - 771
L1 - http://proceedings.mlr.press/v38/ranganath15.pdf
UR - https://proceedings.mlr.press/v38/ranganath15.html
AB - We describe deep exponential families (DEFs), a class of latent variable models that are inspired by the hidden structures used in deep neural networks. DEFs capture a hierarchy of dependencies between latent variables, and are easily generalized to many settings through exponential families. We perform inference using recent "black box" variational inference techniques. We then evaluate various DEFs on text and combine multiple DEFs into a model for pairwise recommendation data. In an extensive study, we show that going beyond one layer improves predictions for DEFs. We demonstrate that DEFs find interesting exploratory structure in large data sets, and give better predictive performance than state-of-the-art models.
ER -
APA
Ranganath, R., Tang, L., Charlin, L. & Blei, D. (2015). Deep Exponential Families. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 38:762-771. Available from https://proceedings.mlr.press/v38/ranganath15.html.
