On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes

Alexander G. de G. Matthews, James Hensman, Richard Turner, Zoubin Ghahramani
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:231-239, 2016.

Abstract

The variational framework for learning inducing variables (Titsias, 2009) has had a large impact on the Gaussian process literature. The framework may be interpreted as minimizing a rigorously defined Kullback-Leibler divergence between the approximating and posterior processes. To our knowledge this connection has thus far gone unremarked in the literature. In this paper we give a substantial generalization of the literature on this topic. We give a new proof of the result for infinite index sets which allows inducing points that are not data points and likelihoods that depend on all function values. We then discuss augmented index sets and show that, contrary to previous works, marginal consistency of augmentation is not enough to guarantee consistency of variational inference with the original model. We then characterize an extra condition where such a guarantee is obtainable. Finally we show how our framework sheds light on interdomain sparse approximations and sparse approximations for Cox processes.
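For concreteness, the objective underlying the framework discussed in the abstract can be sketched as follows. This is the standard (uncollapsed) sparse variational bound, written with illustrative notation not taken from the paper itself: u denotes inducing values at inducing inputs Z (which need not be data points), q(u) is a free-form variational distribution, and p(f | u) is the prior conditional.

\[
\log p(\mathbf{y}) \;\ge\; \mathcal{L} \;=\; \mathbb{E}_{q(\mathbf{u})}\!\big[\,\mathbb{E}_{p(f \mid \mathbf{u})}[\log p(\mathbf{y} \mid f)]\,\big] \;-\; \mathrm{KL}\big[\,q(\mathbf{u})\,\|\,p(\mathbf{u})\,\big],
\]
\[
\log p(\mathbf{y}) - \mathcal{L} \;=\; \mathrm{KL}\big[\,q(f)\,\|\,p(f \mid \mathbf{y})\,\big], \qquad q(f) = p(f_{\neq \mathbf{u}} \mid \mathbf{u})\, q(\mathbf{u}).
\]

Roughly speaking, the abstract's claim corresponds to the second identity: the slack in the bound is the KL divergence between the approximating and posterior distributions, and the paper makes this rigorous when f is an entire stochastic process over an infinite index set rather than a finite vector of function values.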

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-matthews16,
  title     = {On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes},
  author    = {Matthews, Alexander G. de G. and Hensman, James and Turner, Richard and Ghahramani, Zoubin},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {231--239},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/matthews16.pdf},
  url       = {https://proceedings.mlr.press/v51/matthews16.html}
}
APA
Matthews, A. G. d. G., Hensman, J., Turner, R., & Ghahramani, Z. (2016). On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:231-239. Available from https://proceedings.mlr.press/v51/matthews16.html.
