Computationally Efficient Bayesian Learning of Gaussian Process State Space Models

Andreas Svensson, Arno Solin, Simo Särkkä, Thomas Schön
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:213-221, 2016.

Abstract

Gaussian processes allow for flexible specification of prior assumptions of unknown dynamics in state space models. We present a procedure for efficient Bayesian learning in Gaussian process state space models, where the representation is formed by projecting the problem onto a set of approximate eigenfunctions derived from the prior covariance structure. Learning under this family of models can be conducted using a carefully crafted particle MCMC algorithm. This scheme is computationally efficient and yet allows for a fully Bayesian treatment of the problem. Compared to conventional system identification tools or existing learning methods, we show competitive performance and reliable quantification of uncertainties in the model.
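The "projection onto a set of approximate eigenfunctions" refers to a reduced-rank (Hilbert-space) approximation of the Gaussian process prior, in which the unknown transition function is expanded in eigenfunctions of the Laplace operator on a bounded domain, with weights whose prior variances follow the spectral density of the covariance function. The following Python sketch illustrates that expansion for a one-dimensional squared-exponential covariance; the domain half-length L, the number of basis functions m, and the hyperparameters are illustrative assumptions, not values from the paper.

import numpy as np

# Minimal sketch: reduced-rank GP prior via Laplace-operator eigenfunctions
# on the interval [-L, L]. The hyperparameters below (L, m, ell, sf) are
# illustrative assumptions, not values taken from the paper.

L = 4.0      # domain half-length (assumed)
m = 40       # number of basis functions (assumed)
ell = 1.0    # squared-exponential length-scale (assumed)
sf = 1.0     # signal standard deviation (assumed)

j = np.arange(1, m + 1)
sqrt_lambda = np.pi * j / (2.0 * L)   # square roots of the Laplacian eigenvalues

def phi(x):
    """Eigenfunction basis evaluated at points x; returns shape (len(x), m)."""
    x = np.atleast_1d(x)
    return np.sin(sqrt_lambda * (x[:, None] + L)) / np.sqrt(L)

def spectral_density(omega):
    """Spectral density of the squared-exponential covariance function."""
    return sf**2 * np.sqrt(2.0 * np.pi) * ell * np.exp(-0.5 * (omega * ell)**2)

# Prior over basis weights: w_j ~ N(0, S(sqrt(lambda_j))).
S = spectral_density(sqrt_lambda)
rng = np.random.default_rng(0)
w = rng.normal(scale=np.sqrt(S), size=m)

# One draw from the approximate GP prior over the transition function f(x).
x_grid = np.linspace(-L, L, 200)
f_sample = phi(x_grid) @ w

Because the transition function becomes linear in the weights w under this expansion, conditioning on a sampled state trajectory admits efficient weight updates, which is what makes the fully Bayesian treatment via particle MCMC computationally tractable in the paper's setting.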

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-svensson16,
  title     = {Computationally Efficient Bayesian Learning of Gaussian Process State Space Models},
  author    = {Svensson, Andreas and Solin, Arno and Särkkä, Simo and Schön, Thomas},
  booktitle = {Proceedings of the 19th International Conference on Artificial Intelligence and Statistics},
  pages     = {213--221},
  year      = {2016},
  editor    = {Gretton, Arthur and Robert, Christian C.},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/svensson16.pdf},
  url       = {https://proceedings.mlr.press/v51/svensson16.html},
  abstract  = {Gaussian processes allow for flexible specification of prior assumptions of unknown dynamics in state space models. We present a procedure for efficient Bayesian learning in Gaussian process state space models, where the representation is formed by projecting the problem onto a set of approximate eigenfunctions derived from the prior covariance structure. Learning under this family of models can be conducted using a carefully crafted particle MCMC algorithm. This scheme is computationally efficient and yet allows for a fully Bayesian treatment of the problem. Compared to conventional system identification tools or existing learning methods, we show competitive performance and reliable quantification of uncertainties in the model.}
}
Endnote
%0 Conference Paper
%T Computationally Efficient Bayesian Learning of Gaussian Process State Space Models
%A Andreas Svensson
%A Arno Solin
%A Simo Särkkä
%A Thomas Schön
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-svensson16
%I PMLR
%P 213--221
%U https://proceedings.mlr.press/v51/svensson16.html
%V 51
%X Gaussian processes allow for flexible specification of prior assumptions of unknown dynamics in state space models. We present a procedure for efficient Bayesian learning in Gaussian process state space models, where the representation is formed by projecting the problem onto a set of approximate eigenfunctions derived from the prior covariance structure. Learning under this family of models can be conducted using a carefully crafted particle MCMC algorithm. This scheme is computationally efficient and yet allows for a fully Bayesian treatment of the problem. Compared to conventional system identification tools or existing learning methods, we show competitive performance and reliable quantification of uncertainties in the model.
RIS
TY - CPAPER
TI - Computationally Efficient Bayesian Learning of Gaussian Process State Space Models
AU - Andreas Svensson
AU - Arno Solin
AU - Simo Särkkä
AU - Thomas Schön
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-svensson16
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 51
SP - 213
EP - 221
L1 - http://proceedings.mlr.press/v51/svensson16.pdf
UR - https://proceedings.mlr.press/v51/svensson16.html
AB - Gaussian processes allow for flexible specification of prior assumptions of unknown dynamics in state space models. We present a procedure for efficient Bayesian learning in Gaussian process state space models, where the representation is formed by projecting the problem onto a set of approximate eigenfunctions derived from the prior covariance structure. Learning under this family of models can be conducted using a carefully crafted particle MCMC algorithm. This scheme is computationally efficient and yet allows for a fully Bayesian treatment of the problem. Compared to conventional system identification tools or existing learning methods, we show competitive performance and reliable quantification of uncertainties in the model.
ER -
APA
Svensson, A., Solin, A., Särkkä, S. & Schön, T. (2016). Computationally Efficient Bayesian Learning of Gaussian Process State Space Models. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 51:213-221. Available from https://proceedings.mlr.press/v51/svensson16.html.