On casting importance weighted autoencoder to an EM algorithm to learn deep generative models

Dongha Kim, Jaesung Hwang, Yongdai Kim
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2153-2163, 2020.

Abstract

We propose a new and general approach to learn deep generative models. Our approach is based on a new observation that the importance weighted autoencoders (IWAE, Burda et al. (2015)) can be understood as a procedure of estimating the MLE with an EM algorithm. Utilizing this interpretation, we develop a new learning algorithm called importance weighted EM algorithm (IWEM). IWEM is an EM algorithm with self-normalized importance sampling (snIS) where the proposal distribution is carefully selected to reduce the variance due to snIS. In addition, we devise an annealing strategy to stabilize the learning algorithm. For missing data problems, we propose a modified IWEM algorithm called miss-IWEM. Using multiple benchmark datasets, we demonstrate empirically that our proposed methods outperform IWAE with significant margins for both fully-observed and missing data cases.
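The abstract leans on self-normalized importance sampling (snIS), where importance weights are normalized to sum to one so the target density need only be known up to a constant. The sketch below is a generic, minimal snIS estimator on a toy Gaussian problem, not the paper's IWEM algorithm; the function names and the example target/proposal are illustrative assumptions.

```python
import numpy as np

# Generic self-normalized importance sampling (snIS) sketch.
# Hypothetical toy setup: estimate E_p[f(z)] under a target p known
# only up to a normalizing constant, using samples from a proposal q.

rng = np.random.default_rng(0)

def snis_estimate(f, log_p_unnorm, log_q, z):
    """snIS estimate of E_p[f(z)] from proposal samples z ~ q.

    Weights are normalized to sum to 1 (the "self-normalized" step),
    so log_p_unnorm may omit the normalizing constant of p.
    """
    log_w = log_p_unnorm(z) - log_q(z)
    log_w -= log_w.max()          # subtract max for numerical stability
    w = np.exp(log_w)
    w /= w.sum()                  # self-normalization
    return np.sum(w * f(z))

# Toy example: target p = N(1, 1) given unnormalized, proposal q = N(0, 2^2).
z = rng.normal(0.0, 2.0, size=50_000)
est = snis_estimate(
    f=lambda z: z,                                     # estimate the mean
    log_p_unnorm=lambda z: -0.5 * (z - 1.0) ** 2,      # N(1,1) up to a constant
    log_q=lambda z: -0.5 * (z / 2.0) ** 2 - np.log(2.0),
    z=z,
)
# est approximates E_p[z] = 1; its variance depends on how well q covers p,
# which is why the paper emphasizes choosing the proposal carefully.
```

The dependence of the estimator's variance on the proposal q is the issue the abstract's "carefully selected" proposal and annealing strategy address.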

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-kim20b,
  title     = {On casting importance weighted autoencoder to an EM algorithm to learn deep generative models},
  author    = {Kim, Dongha and Hwang, Jaesung and Kim, Yongdai},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {2153--2163},
  year      = {2020},
  editor    = {Silvia Chiappa and Roberto Calandra},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/kim20b/kim20b.pdf},
  url       = {http://proceedings.mlr.press/v108/kim20b.html},
  abstract  = {We propose a new and general approach to learn deep generative models. Our approach is based on a new observation that the importance weighted autoencoders (IWAE, Burda et al. (2015)) can be understood as a procedure of estimating the MLE with an EM algorithm. Utilizing this interpretation, we develop a new learning algorithm called importance weighted EM algorithm (IWEM). IWEM is an EM algorithm with self-normalized importance sampling (snIS) where the proposal distribution is carefully selected to reduce the variance due to snIS. In addition, we devise an annealing strategy to stabilize the learning algorithm. For missing data problems, we propose a modified IWEM algorithm called miss-IWEM. Using multiple benchmark datasets, we demonstrate empirically that our proposed methods outperform IWAE with significant margins for both fully-observed and missing data cases.}
}
Endnote
%0 Conference Paper
%T On casting importance weighted autoencoder to an EM algorithm to learn deep generative models
%A Dongha Kim
%A Jaesung Hwang
%A Yongdai Kim
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-kim20b
%I PMLR
%P 2153--2163
%U http://proceedings.mlr.press/v108/kim20b.html
%V 108
%X We propose a new and general approach to learn deep generative models. Our approach is based on a new observation that the importance weighted autoencoders (IWAE, Burda et al. (2015)) can be understood as a procedure of estimating the MLE with an EM algorithm. Utilizing this interpretation, we develop a new learning algorithm called importance weighted EM algorithm (IWEM). IWEM is an EM algorithm with self-normalized importance sampling (snIS) where the proposal distribution is carefully selected to reduce the variance due to snIS. In addition, we devise an annealing strategy to stabilize the learning algorithm. For missing data problems, we propose a modified IWEM algorithm called miss-IWEM. Using multiple benchmark datasets, we demonstrate empirically that our proposed methods outperform IWAE with significant margins for both fully-observed and missing data cases.
APA
Kim, D., Hwang, J. & Kim, Y. (2020). On casting importance weighted autoencoder to an EM algorithm to learn deep generative models. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:2153-2163. Available from http://proceedings.mlr.press/v108/kim20b.html.