Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models

Alessandro Davide Ialongo, Mark Van Der Wilk, James Hensman, Carl Edward Rasmussen
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:2931-2940, 2019.

Abstract

We identify a new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process. Inference in this setting has either employed computationally intensive MCMC methods, or relied on factorisations of the variational posterior. As we demonstrate in our experiments, the factorisation between latent system states and transition function can lead to a miscalibrated posterior and to learning unnecessarily large noise terms. We eliminate this factorisation by explicitly modelling the dependence between state trajectories and the low-rank representation of our Gaussian process posterior. Samples of the latent states can then be tractably generated by conditioning on this representation. The method we obtain gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods.
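The sampling step described above ("samples of the latent states can then be tractably generated by conditioning on this representation") can be illustrated with a minimal sketch: draw one sample of the inducing outputs u from a variational distribution q(u), then roll the state forward by conditioning the GP transition function on (Z, u) at each step. This is an illustrative reconstruction, not the paper's implementation; the RBF kernel, the variational parameters (`m`, `S_chol`), and all variable names are assumptions.

```python
# Hypothetical sketch of non-mean-field trajectory sampling in a GP
# state-space model: sample u ~ q(u) once, then condition the GP
# transition on (Z, u) to propagate the latent state. All names and
# parameter values are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between 1-D input sets a and b."""
    d = a[:, None] - b[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Inducing inputs Z and a variational distribution q(u) = N(m, S) over the
# transition function's values at Z (the low-rank GP representation).
Z = np.linspace(-2.0, 2.0, 10)        # M = 10 inducing inputs (assumed)
m = np.tanh(Z)                        # illustrative variational mean
S_chol = 0.1 * np.eye(len(Z))         # Cholesky factor of variational cov.

Kzz = rbf(Z, Z) + 1e-6 * np.eye(len(Z))   # jitter for numerical stability
Lzz = np.linalg.cholesky(Kzz)

def sample_trajectory(x0, T, process_noise=0.05):
    """Draw one latent trajectory of length T+1, conditioning every
    transition on the same sample of u (states and function covary)."""
    u = m + S_chol @ rng.standard_normal(len(Z))  # one sample u ~ q(u)
    alpha = np.linalg.solve(Kzz, u)               # Kzz^{-1} u, reused each step
    x = np.empty(T + 1)
    x[0] = x0
    for t in range(T):
        kxz = rbf(np.array([x[t]]), Z)            # 1 x M cross-covariance
        mean = (kxz @ alpha)[0]                   # conditional GP mean at x_t
        v = np.linalg.solve(Lzz, kxz.T)
        var = rbf(np.array([x[t]]), np.array([x[t]]))[0, 0] - (v.T @ v).item()
        x[t + 1] = (mean
                    + np.sqrt(max(var, 0.0)) * rng.standard_normal()
                    + process_noise * rng.standard_normal())
    return x

traj = sample_trajectory(x0=0.0, T=20)
print(traj.shape)
```

Because every transition in a trajectory is conditioned on the same sampled u, the dependence between the state path and the transition function is preserved, which is what a mean-field factorisation between states and function values discards.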

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-ialongo19a,
  title     = {Overcoming Mean-Field Approximations in Recurrent {G}aussian Process Models},
  author    = {Ialongo, Alessandro Davide and Van Der Wilk, Mark and Hensman, James and Rasmussen, Carl Edward},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {2931--2940},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/ialongo19a/ialongo19a.pdf},
  url       = {https://proceedings.mlr.press/v97/ialongo19a.html},
  abstract  = {We identify a new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process. Inference in this setting has either employed computationally intensive MCMC methods, or relied on factorisations of the variational posterior. As we demonstrate in our experiments, the factorisation between latent system states and transition function can lead to a miscalibrated posterior and to learning unnecessarily large noise terms. We eliminate this factorisation by explicitly modelling the dependence between state trajectories and the low-rank representation of our Gaussian process posterior. Samples of the latent states can then be tractably generated by conditioning on this representation. The method we obtain gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods.}
}
Endnote
%0 Conference Paper
%T Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models
%A Alessandro Davide Ialongo
%A Mark Van Der Wilk
%A James Hensman
%A Carl Edward Rasmussen
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-ialongo19a
%I PMLR
%P 2931--2940
%U https://proceedings.mlr.press/v97/ialongo19a.html
%V 97
%X We identify a new variational inference scheme for dynamical systems whose transition function is modelled by a Gaussian process. Inference in this setting has either employed computationally intensive MCMC methods, or relied on factorisations of the variational posterior. As we demonstrate in our experiments, the factorisation between latent system states and transition function can lead to a miscalibrated posterior and to learning unnecessarily large noise terms. We eliminate this factorisation by explicitly modelling the dependence between state trajectories and the low-rank representation of our Gaussian process posterior. Samples of the latent states can then be tractably generated by conditioning on this representation. The method we obtain gives better predictive performance and more calibrated estimates of the transition function, yet maintains the same time and space complexities as mean-field methods.
APA
Ialongo, A.D., Van Der Wilk, M., Hensman, J. &amp; Rasmussen, C.E. (2019). Overcoming Mean-Field Approximations in Recurrent Gaussian Process Models. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:2931-2940. Available from https://proceedings.mlr.press/v97/ialongo19a.html.
