Auxiliary Deep Generative Models

Lars Maaløe, Casper Kaae Sønderby, Søren Kaae Sønderby, Ole Winther
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:1445-1453, 2016.

Abstract

Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables, which improve the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance within semi-supervised learning on the MNIST, SVHN and NORB datasets.
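The auxiliary-variable construction the abstract refers to can be sketched as follows (a standard auxiliary lower bound; the notation here is illustrative, not copied from the paper):

```latex
\log p(x)
= \log \int p(x, z)\, dz
\;\geq\;
\mathbb{E}_{q(a, z \mid x)}
\left[
  \log \frac{p(a \mid z, x)\, p(x, z)}
            {q(a \mid x)\, q(z \mid a, x)}
\right]
```

Because the auxiliary conditional $p(a \mid z, x)$ integrates to one, the marginal $p(x) = \int p(x,z)\,dz$ is unchanged; only the inference side gains expressiveness, since the implied posterior approximation $q(z \mid x) = \int q(a \mid x)\, q(z \mid a, x)\, da$ is a continuous mixture rather than a single factorized distribution.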

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-maaloe16,
  title     = {Auxiliary Deep Generative Models},
  author    = {Maaløe, Lars and Sønderby, Casper Kaae and Sønderby, Søren Kaae and Winther, Ole},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {1445--1453},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/maaloe16.pdf},
  url       = {https://proceedings.mlr.press/v48/maaloe16.html},
  abstract  = {Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables which improves the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance within semi-supervised learning on MNIST, SVHN and NORB datasets.}
}
Endnote
%0 Conference Paper
%T Auxiliary Deep Generative Models
%A Lars Maaløe
%A Casper Kaae Sønderby
%A Søren Kaae Sønderby
%A Ole Winther
%B Proceedings of The 33rd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Maria Florina Balcan
%E Kilian Q. Weinberger
%F pmlr-v48-maaloe16
%I PMLR
%P 1445--1453
%U https://proceedings.mlr.press/v48/maaloe16.html
%V 48
%X Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables which improves the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance within semi-supervised learning on MNIST, SVHN and NORB datasets.
RIS
TY  - CPAPER
TI  - Auxiliary Deep Generative Models
AU  - Lars Maaløe
AU  - Casper Kaae Sønderby
AU  - Søren Kaae Sønderby
AU  - Ole Winther
BT  - Proceedings of The 33rd International Conference on Machine Learning
DA  - 2016/06/11
ED  - Maria Florina Balcan
ED  - Kilian Q. Weinberger
ID  - pmlr-v48-maaloe16
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 48
SP  - 1445
EP  - 1453
L1  - http://proceedings.mlr.press/v48/maaloe16.pdf
UR  - https://proceedings.mlr.press/v48/maaloe16.html
AB  - Deep generative models parameterized by neural networks have recently achieved state-of-the-art performance in unsupervised and semi-supervised learning. We extend deep generative models with auxiliary variables which improves the variational approximation. The auxiliary variables leave the generative model unchanged but make the variational distribution more expressive. Inspired by the structure of the auxiliary variable we also propose a model with two stochastic layers and skip connections. Our findings suggest that more expressive and properly specified deep generative models converge faster with better results. We show state-of-the-art performance within semi-supervised learning on MNIST, SVHN and NORB datasets.
ER  -
APA
Maaløe, L., Sønderby, C.K., Sønderby, S.K., & Winther, O. (2016). Auxiliary Deep Generative Models. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:1445-1453. Available from https://proceedings.mlr.press/v48/maaloe16.html.
