Variational Inference for Sequential Data with Future Likelihood Estimates

Geon-Hyeong Kim, Youngsoo Jang, Hongseok Yang, Kee-Eung Kim
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:5296-5305, 2020.

Abstract

The recent development of flexible and scalable variational inference algorithms has popularized the use of deep probabilistic models in a wide range of applications. However, learning and reasoning about high-dimensional models with nondifferentiable densities are still a challenge. For such a model, inference algorithms struggle to estimate the gradients of variational objectives accurately, due to high variance in their estimates. To tackle this challenge, we present a novel variational inference algorithm for sequential data, which performs well even when the density from the model is not differentiable, for instance, due to the use of discrete random variables. The key feature of our algorithm is that it estimates future likelihoods at all time steps. The estimated future likelihoods form the core of our new low-variance gradient estimator. We formally analyze our gradient estimator from the perspective of variational objective, and show the effectiveness of our algorithm with synthetic and real datasets.
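The high-variance problem the abstract refers to arises with score-function (REINFORCE-style) gradient estimators, which are the standard fallback when the density is not differentiable in the latent variable, e.g. for discrete latents. The sketch below is a generic illustration of that problem and of variance reduction via a baseline; it is not the paper's estimator, and the Bernoulli model and reward are illustrative assumptions.

```python
# Hedged sketch: score-function (REINFORCE) gradient estimation for a
# discrete (Bernoulli) latent variable, with and without a baseline.
# Illustrates the high-variance issue for nondifferentiable objectives;
# this is NOT the paper's future-likelihood estimator.
import math
import random

random.seed(0)

def grad_estimates(theta, n_samples, baseline=0.0):
    """Per-sample estimates of d/dtheta E_{z~Bern(p)}[f(z)],
    where p = sigmoid(theta) and f(z) = z (nondifferentiable in z)."""
    p = 1.0 / (1.0 + math.exp(-theta))
    grads = []
    for _ in range(n_samples):
        z = 1 if random.random() < p else 0
        f = float(z)
        # score: d/dtheta log Bern(z | sigmoid(theta)) = z - p
        score = z - p
        grads.append((f - baseline) * score)
    return grads

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

theta = 0.0  # so p = 0.5, and the true gradient is p*(1-p) = 0.25
naive = grad_estimates(theta, 100_000, baseline=0.0)
controlled = grad_estimates(theta, 100_000, baseline=0.5)  # baseline = E[f]
print(variance(controlled) < variance(naive))  # prints True
```

Subtracting a baseline leaves the estimator unbiased (the score has zero mean) but can shrink its variance dramatically; the paper's estimated future likelihoods play an analogous variance-reducing role at every time step of a sequence.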

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-kim20d,
  title     = {Variational Inference for Sequential Data with Future Likelihood Estimates},
  author    = {Kim, Geon-Hyeong and Jang, Youngsoo and Yang, Hongseok and Kim, Kee-Eung},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {5296--5305},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/kim20d/kim20d.pdf},
  url       = {https://proceedings.mlr.press/v119/kim20d.html},
  abstract  = {The recent development of flexible and scalable variational inference algorithms has popularized the use of deep probabilistic models in a wide range of applications. However, learning and reasoning about high-dimensional models with nondifferentiable densities are still a challenge. For such a model, inference algorithms struggle to estimate the gradients of variational objectives accurately, due to high variance in their estimates. To tackle this challenge, we present a novel variational inference algorithm for sequential data, which performs well even when the density from the model is not differentiable, for instance, due to the use of discrete random variables. The key feature of our algorithm is that it estimates future likelihoods at all time steps. The estimated future likelihoods form the core of our new low-variance gradient estimator. We formally analyze our gradient estimator from the perspective of variational objective, and show the effectiveness of our algorithm with synthetic and real datasets.}
}
Endnote
%0 Conference Paper
%T Variational Inference for Sequential Data with Future Likelihood Estimates
%A Geon-Hyeong Kim
%A Youngsoo Jang
%A Hongseok Yang
%A Kee-Eung Kim
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-kim20d
%I PMLR
%P 5296--5305
%U https://proceedings.mlr.press/v119/kim20d.html
%V 119
%X The recent development of flexible and scalable variational inference algorithms has popularized the use of deep probabilistic models in a wide range of applications. However, learning and reasoning about high-dimensional models with nondifferentiable densities are still a challenge. For such a model, inference algorithms struggle to estimate the gradients of variational objectives accurately, due to high variance in their estimates. To tackle this challenge, we present a novel variational inference algorithm for sequential data, which performs well even when the density from the model is not differentiable, for instance, due to the use of discrete random variables. The key feature of our algorithm is that it estimates future likelihoods at all time steps. The estimated future likelihoods form the core of our new low-variance gradient estimator. We formally analyze our gradient estimator from the perspective of variational objective, and show the effectiveness of our algorithm with synthetic and real datasets.
APA
Kim, G., Jang, Y., Yang, H. & Kim, K. (2020). Variational Inference for Sequential Data with Future Likelihood Estimates. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:5296-5305. Available from https://proceedings.mlr.press/v119/kim20d.html.