Hierarchical Decompositional Mixtures of Variational Autoencoders

Ping Liang Tan, Robert Peharz
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6115-6124, 2019.

Abstract

Variational autoencoders (VAEs) have received considerable attention, since they allow us to learn expressive neural density estimators effectively and efficiently. However, learning and inference in VAEs is still problematic due to the sensitive interplay between the generative model and the inference network. Since these problems become generally more severe in high dimensions, we propose a novel hierarchical mixture model over low-dimensional VAE experts. Our model decomposes the overall learning problem into many smaller problems, which are coordinated by the hierarchical mixture, represented by a sum-product network. In experiments we show that our models outperform classical VAEs on almost all of our experimental benchmarks. Moreover, we show that our model is highly data efficient and degrades very gracefully in extremely low data regimes.
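The architecture the abstract describes lends itself to a compact illustration. Below is a minimal, hypothetical PyTorch sketch of the core idea, not the authors' released code: small VAE "experts" model low-dimensional slices of the input, product nodes factorize over disjoint scopes by summing log-densities, and a sum node mixes components via logsumexp over learned weights. Mixing per-component ELBOs this way still yields a lower bound on the mixture log-likelihood, since logsumexp is monotone in each argument. All class and variable names (SmallVAE, ProductNode, SumNode) are illustrative assumptions.

import torch
import torch.nn as nn

class SmallVAE(nn.Module):
    """VAE expert over a low-dimensional scope; elbo() lower-bounds log p(x_scope)."""
    def __init__(self, scope, latent_dim=2, hidden=32):
        super().__init__()
        self.scope = scope  # indices of the input dimensions this expert models
        d = len(scope)
        self.enc = nn.Sequential(nn.Linear(d, hidden), nn.Tanh(),
                                 nn.Linear(hidden, 2 * latent_dim))
        self.dec = nn.Sequential(nn.Linear(latent_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, d))

    def elbo(self, x):
        xs = x[:, self.scope]
        mu, logvar = self.enc(xs).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        recon = self.dec(z)
        # Unit-variance Gaussian reconstruction term (up to a constant)
        log_px = -0.5 * ((xs - recon) ** 2).sum(-1)
        # Analytic KL divergence to the standard normal prior N(0, I)
        kl = -0.5 * (1 + logvar - mu ** 2 - logvar.exp()).sum(-1)
        return log_px - kl  # per-example ELBO, shape (batch,)

class ProductNode(nn.Module):
    """Factorization over disjoint scopes: log p(x) = sum of children's log-densities."""
    def __init__(self, children):
        super().__init__()
        self.kids = nn.ModuleList(children)

    def elbo(self, x):
        return torch.stack([c.elbo(x) for c in self.kids], dim=-1).sum(-1)

class SumNode(nn.Module):
    """Mixture over same-scope children: log p(x) = logsumexp_k(log w_k + log p_k(x))."""
    def __init__(self, children):
        super().__init__()
        self.kids = nn.ModuleList(children)
        self.logits = nn.Parameter(torch.zeros(len(children)))  # learnable mixture weights

    def elbo(self, x):
        logw = torch.log_softmax(self.logits, dim=0)
        comps = torch.stack([c.elbo(x) for c in self.kids], dim=-1)
        return torch.logsumexp(logw + comps, dim=-1)

# A depth-1 example over 4 input dimensions: two mixture components,
# each a product of two 2-dimensional VAE experts, using different
# decompositions of the variables.
model = SumNode([
    ProductNode([SmallVAE([0, 1]), SmallVAE([2, 3])]),
    ProductNode([SmallVAE([0, 2]), SmallVAE([1, 3])]),
])
x = torch.randn(8, 4)
loss = -model.elbo(x).mean()  # maximize the ELBO lower bound
loss.backward()

In this sketch each expert only ever sees a 2-dimensional slice of the data, which is the decomposition argument the abstract makes: the hard high-dimensional density estimation problem is split into many small, easier ones, and the sum-product structure coordinates how they recombine.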

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-tan19b,
  title     = {Hierarchical Decompositional Mixtures of Variational Autoencoders},
  author    = {Tan, Ping Liang and Peharz, Robert},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6115--6124},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/tan19b/tan19b.pdf},
  url       = {https://proceedings.mlr.press/v97/tan19b.html},
  abstract  = {Variational autoencoders (VAEs) have received considerable attention, since they allow us to learn expressive neural density estimators effectively and efficiently. However, learning and inference in VAEs is still problematic due to the sensitive interplay between the generative model and the inference network. Since these problems become generally more severe in high dimensions, we propose a novel hierarchical mixture model over low-dimensional VAE experts. Our model decomposes the overall learning problem into many smaller problems, which are coordinated by the hierarchical mixture, represented by a sum-product network. In experiments we show that our models outperform classical VAEs on almost all of our experimental benchmarks. Moreover, we show that our model is highly data efficient and degrades very gracefully in extremely low data regimes.}
}
Endnote
%0 Conference Paper
%T Hierarchical Decompositional Mixtures of Variational Autoencoders
%A Ping Liang Tan
%A Robert Peharz
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-tan19b
%I PMLR
%P 6115--6124
%U https://proceedings.mlr.press/v97/tan19b.html
%V 97
%X Variational autoencoders (VAEs) have received considerable attention, since they allow us to learn expressive neural density estimators effectively and efficiently. However, learning and inference in VAEs is still problematic due to the sensitive interplay between the generative model and the inference network. Since these problems become generally more severe in high dimensions, we propose a novel hierarchical mixture model over low-dimensional VAE experts. Our model decomposes the overall learning problem into many smaller problems, which are coordinated by the hierarchical mixture, represented by a sum-product network. In experiments we show that our models outperform classical VAEs on almost all of our experimental benchmarks. Moreover, we show that our model is highly data efficient and degrades very gracefully in extremely low data regimes.
APA
Tan, P. L., & Peharz, R. (2019). Hierarchical Decompositional Mixtures of Variational Autoencoders. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6115-6124. Available from https://proceedings.mlr.press/v97/tan19b.html.
