Doubly Semi-Implicit Variational Inference
Proceedings of Machine Learning Research, PMLR 89:2593-2602, 2019.
Abstract
We extend the existing framework of semi-implicit variational inference (SIVI) and introduce doubly semi-implicit variational inference (DSIVI), a way to perform variational inference and learning when both the approximate posterior and the prior distribution are semi-implicit. In other words, DSIVI performs inference in models where the prior and the posterior can be expressed as an intractable infinite mixture of some analytic density with a highly flexible implicit mixing distribution. We provide a sandwich bound on the evidence lower bound (ELBO) objective that can be made arbitrarily tight. Unlike discriminator-based and kernel-based approaches to implicit variational inference, DSIVI optimizes a proper lower bound on ELBO that is asymptotically exact. We evaluate DSIVI on a set of problems that benefit from implicit priors. In particular, we show that DSIVI gives rise to a simple modification of VampPrior, the current state-of-the-art prior for variational autoencoders, which improves its performance.
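To illustrate the semi-implicit construction the abstract refers to, the sketch below (a hypothetical NumPy example, not the authors' code) builds a distribution q(z) = E_psi[N(z | psi, sigma^2)] whose Gaussian component is analytic but whose mixing distribution over psi is an implicit push-forward of noise. It then computes two Monte Carlo estimates of log q(z): a plain average over fresh mixing samples, which lower-bounds log q(z) in expectation by Jensen's inequality, and a SIVI-style estimate that also includes the psi that generated z, which upper-bounds it in expectation. The transform in `sample_psi`, the noise scale, and the sample count K are all illustrative choices; both bounds tighten as K grows, which is the sandwich behavior described above.

```python
import numpy as np

rng = np.random.default_rng(0)
SIGMA = 0.5  # std of the analytic Gaussian component (illustrative)

def sample_psi(n):
    # Implicit mixing distribution: a nonlinear push-forward of Gaussian
    # noise. Its density is intractable, but sampling is easy.
    eps = rng.normal(size=n)
    return 2.0 * np.tanh(eps)

def log_normal_pdf(z, mu, sigma=SIGMA):
    # Log density of the analytic component N(z | mu, sigma^2).
    return -0.5 * np.log(2 * np.pi * sigma**2) - (z - mu) ** 2 / (2 * sigma**2)

# Draw z ~ q(z) = E_psi[ N(z | psi, SIGMA^2) ], remembering the psi used.
psi0 = sample_psi(1)[0]
z = rng.normal(psi0, SIGMA)

K = 1000
psis = sample_psi(K)

# Plain Monte Carlo over fresh psi: a lower bound on log q(z) in expectation.
log_q_lower = np.log(np.mean(np.exp(log_normal_pdf(z, psis))))

# Including the generating psi0: an upper bound on log q(z) in expectation.
log_q_upper = np.log(np.mean(np.exp(log_normal_pdf(z, np.append(psis, psi0)))))

print(f"log q(z) estimates: lower ~ {log_q_lower:.3f}, upper ~ {log_q_upper:.3f}")
```

Plugging these two estimates into the ELBO in place of the intractable log q(z) yields the lower and upper surrogates of the sandwich bound; both become exact as K goes to infinity.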