On casting importance weighted autoencoder to an EM algorithm to learn deep generative models
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2153-2163, 2020.
Abstract
We propose a new and general approach to learning deep generative models. Our approach is based on a new observation that the importance weighted autoencoder (IWAE; Burda et al., 2015) can be understood as a procedure for maximum likelihood estimation via an EM algorithm. Utilizing this interpretation, we develop a new learning algorithm called the importance weighted EM algorithm (IWEM). IWEM is an EM algorithm with self-normalized importance sampling (snIS) in which the proposal distribution is carefully chosen to reduce the variance introduced by snIS. In addition, we devise an annealing strategy to stabilize the learning algorithm. For missing-data problems, we propose a modified IWEM algorithm called miss-IWEM. Using multiple benchmark datasets, we demonstrate empirically that our proposed methods outperform IWAE by significant margins in both the fully observed and missing-data settings.
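As a minimal sketch of the ingredient named in the abstract, the generic self-normalized importance sampling estimate of the EM objective for a latent-variable model \(p_\theta(x, z)\) with a proposal \(q(z \mid x)\) takes the form below; here \(K\), \(q\), and the current parameter \(\theta'\) are generic placeholders, and the specific proposal selection and annealing that define IWEM are not reproduced here.

\[
\hat{Q}(\theta; \theta') = \sum_{k=1}^{K} \bar{w}_k \, \log p_\theta(x, z_k),
\qquad
\bar{w}_k = \frac{w_k}{\sum_{j=1}^{K} w_j},
\qquad
w_k = \frac{p_{\theta'}(x, z_k)}{q(z_k \mid x)},
\qquad
z_k \sim q(\cdot \mid x),
\]

where the self-normalization of the weights removes the unknown marginal \(p_{\theta'}(x)\) that would otherwise be needed to evaluate the posterior \(p_{\theta'}(z \mid x)\) exactly.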