Stochastic Backpropagation and Approximate Inference in Deep Generative Models

Danilo Jimenez Rezende, Shakir Mohamed, Daan Wierstra
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(2):1278-1286, 2014.

Abstract

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent an approximate posterior distribution and uses this for optimisation of a variational lower bound. We develop stochastic backpropagation – rules for gradient backpropagation through stochastic variables – and derive an algorithm that allows for joint optimisation of the parameters of both the generative and recognition models. We demonstrate on several real-world data sets that by using stochastic backpropagation and variational inference, we obtain models that are able to generate realistic samples of data, allow for accurate imputations of missing data, and provide a useful tool for high-dimensional data visualisation.
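The core computational idea in the abstract can be made concrete with a short sketch. Below is a minimal, self-contained example (not the authors' code) of stochastic backpropagation for a diagonal-Gaussian latent variable: the sample z ~ N(mu, diag(sigma^2)) is rewritten as the deterministic map z = mu + sigma * eps with eps ~ N(0, I), so the gradient of a Monte Carlo expectation passes through mu and sigma. The toy integrand f is a hypothetical stand-in for the data term log p(x|z) of the variational lower bound; all names are illustrative.

import numpy as np

rng = np.random.default_rng(0)

def f(z):
    # Toy integrand whose expectation E_q[f(z)] we differentiate;
    # in the paper this role is played by the reconstruction term
    # of the variational lower bound.
    return np.sum(z ** 2, axis=-1)

def grad_f(z):
    # Gradient of the toy integrand with respect to z.
    return 2.0 * z

# Variational (recognition) parameters of q(z) = N(mu, diag(sigma^2)).
mu = np.array([0.5, -1.0])
sigma = np.exp(np.array([0.1, -0.3]))

# Stochastic backpropagation: sample eps, reparameterise z = mu + sigma*eps,
# and push df/dz through this deterministic map.
S = 100_000
eps = rng.standard_normal((S, 2))
z = mu + sigma * eps

g = grad_f(z)                         # df/dz at each sample
grad_mu = g.mean(axis=0)              # dE[f]/dmu    = E[df/dz]
grad_sigma = (g * eps).mean(axis=0)   # dE[f]/dsigma = E[df/dz * eps]

# For f(z) = sum_i z_i^2 we have E_q[f] = sum_i (mu_i^2 + sigma_i^2),
# so the exact gradients are 2*mu and 2*sigma.
print("grad wrt mu   :", grad_mu, "  exact:", 2 * mu)
print("grad wrt sigma:", grad_sigma, "  exact:", 2 * sigma)

Running this, the Monte Carlo gradient estimates agree with the exact gradients to within sampling error, which is the property that lets the generative and recognition parameters be optimised jointly by gradient descent on the lower bound.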

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-rezende14,
  title     = {Stochastic Backpropagation and Approximate Inference in Deep Generative Models},
  author    = {Rezende, Danilo Jimenez and Mohamed, Shakir and Wierstra, Daan},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {1278--1286},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Beijing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/rezende14.pdf},
  url       = {https://proceedings.mlr.press/v32/rezende14.html},
  abstract  = {We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent an approximate posterior distribution and uses this for optimisation of a variational lower bound. We develop stochastic backpropagation – rules for gradient backpropagation through stochastic variables – and derive an algorithm that allows for joint optimisation of the parameters of both the generative and recognition models. We demonstrate on several real-world data sets that by using stochastic backpropagation and variational inference, we obtain models that are able to generate realistic samples of data, allow for accurate imputations of missing data, and provide a useful tool for high-dimensional data visualisation.}
}
Endnote
%0 Conference Paper
%T Stochastic Backpropagation and Approximate Inference in Deep Generative Models
%A Danilo Jimenez Rezende
%A Shakir Mohamed
%A Daan Wierstra
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-rezende14
%I PMLR
%P 1278--1286
%U https://proceedings.mlr.press/v32/rezende14.html
%V 32
%N 2
%X We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent an approximate posterior distribution and uses this for optimisation of a variational lower bound. We develop stochastic backpropagation – rules for gradient backpropagation through stochastic variables – and derive an algorithm that allows for joint optimisation of the parameters of both the generative and recognition models. We demonstrate on several real-world data sets that by using stochastic backpropagation and variational inference, we obtain models that are able to generate realistic samples of data, allow for accurate imputations of missing data, and provide a useful tool for high-dimensional data visualisation.
RIS
TY - CPAPER
TI - Stochastic Backpropagation and Approximate Inference in Deep Generative Models
AU - Danilo Jimenez Rezende
AU - Shakir Mohamed
AU - Daan Wierstra
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/06/18
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-rezende14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 2
SP - 1278
EP - 1286
L1 - http://proceedings.mlr.press/v32/rezende14.pdf
UR - https://proceedings.mlr.press/v32/rezende14.html
AB - We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model to represent an approximate posterior distribution and uses this for optimisation of a variational lower bound. We develop stochastic backpropagation – rules for gradient backpropagation through stochastic variables – and derive an algorithm that allows for joint optimisation of the parameters of both the generative and recognition models. We demonstrate on several real-world data sets that by using stochastic backpropagation and variational inference, we obtain models that are able to generate realistic samples of data, allow for accurate imputations of missing data, and provide a useful tool for high-dimensional data visualisation.
ER -
APA
Rezende, D.J., Mohamed, S. & Wierstra, D. (2014). Stochastic Backpropagation and Approximate Inference in Deep Generative Models. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(2):1278-1286. Available from https://proceedings.mlr.press/v32/rezende14.html.