Probabilistic Recurrent State-Space Models

Andreas Doerr, Christian Daniel, Martin Schiegg, Duy Nguyen-Tuong, Stefan Schaal, Marc Toussaint, Sebastian Trimpe
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:1280-1289, 2018.

Abstract

State-space models (SSMs) are a highly expressive model class for learning patterns in time series data and for system identification. Deterministic versions of SSMs (e.g., LSTMs) proved extremely successful in modeling complex time series data. Fully probabilistic SSMs, however, are often found hard to train, even for smaller problems. We propose a novel model formulation and a scalable training algorithm based on doubly stochastic variational inference and Gaussian processes. This combination allows efficient incorporation of latent state temporal correlations, which we found to be key to robust training. The effectiveness of the proposed PR-SSM is evaluated on a set of real-world benchmark datasets in comparison to state-of-the-art probabilistic model learning methods. Scalability and robustness are demonstrated on a high dimensional problem.
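To make the model class concrete, the sketch below rolls out a generic probabilistic state-space model by Monte Carlo: a stochastic latent transition x_{t+1} = f(x_t, u_t) + noise with a noisy observation y_t. The toy transition `f`, the noise levels, and all parameter values are illustrative placeholders, not the GP transition model or the doubly stochastic variational inference procedure from the paper.

```python
import numpy as np

# Schematic particle rollout of a probabilistic state-space model:
#   x_{t+1} = f(x_t, u_t) + eps_t,  eps_t ~ N(0, proc_std^2)
#   y_t     = x_t + nu_t,           nu_t  ~ N(0, obs_std^2)
# f is a hand-picked toy nonlinearity standing in for a learned
# (e.g. GP-based) transition; it is NOT the PR-SSM model itself.

rng = np.random.default_rng(0)

def f(x, u):
    # Toy contracting nonlinear transition (placeholder).
    return 0.9 * x + 0.1 * np.tanh(x) + 0.05 * u

def rollout(x0, controls, proc_std=0.01, obs_std=0.05, n_particles=100):
    """Propagate particles through the stochastic transition and
    return the per-step mean and std of the predicted observations."""
    x = np.full(n_particles, x0, dtype=float)
    means, stds = [], []
    for u in controls:
        x = f(x, u) + proc_std * rng.standard_normal(n_particles)
        y = x + obs_std * rng.standard_normal(n_particles)
        means.append(y.mean())
        stds.append(y.std())
    return np.array(means), np.array(stds)

mu, sigma = rollout(x0=1.0, controls=np.zeros(20))
```

Propagating all particles jointly through each step is what lets temporal correlations in the latent state carry forward through the rollout, which the abstract identifies as key to robust training.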

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-doerr18a,
  title     = {Probabilistic Recurrent State-Space Models},
  author    = {Doerr, Andreas and Daniel, Christian and Schiegg, Martin and Nguyen-Tuong, Duy and Schaal, Stefan and Toussaint, Marc and Trimpe, Sebastian},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {1280--1289},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/doerr18a/doerr18a.pdf},
  url       = {https://proceedings.mlr.press/v80/doerr18a.html},
  abstract  = {State-space models (SSMs) are a highly expressive model class for learning patterns in time series data and for system identification. Deterministic versions of SSMs (e.g., LSTMs) proved extremely successful in modeling complex time series data. Fully probabilistic SSMs, however, are often found hard to train, even for smaller problems. We propose a novel model formulation and a scalable training algorithm based on doubly stochastic variational inference and Gaussian processes. This combination allows efficient incorporation of latent state temporal correlations, which we found to be key to robust training. The effectiveness of the proposed PR-SSM is evaluated on a set of real-world benchmark datasets in comparison to state-of-the-art probabilistic model learning methods. Scalability and robustness are demonstrated on a high dimensional problem.}
}
Endnote
%0 Conference Paper
%T Probabilistic Recurrent State-Space Models
%A Andreas Doerr
%A Christian Daniel
%A Martin Schiegg
%A Duy Nguyen-Tuong
%A Stefan Schaal
%A Marc Toussaint
%A Sebastian Trimpe
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-doerr18a
%I PMLR
%P 1280--1289
%U https://proceedings.mlr.press/v80/doerr18a.html
%V 80
%X State-space models (SSMs) are a highly expressive model class for learning patterns in time series data and for system identification. Deterministic versions of SSMs (e.g., LSTMs) proved extremely successful in modeling complex time series data. Fully probabilistic SSMs, however, are often found hard to train, even for smaller problems. We propose a novel model formulation and a scalable training algorithm based on doubly stochastic variational inference and Gaussian processes. This combination allows efficient incorporation of latent state temporal correlations, which we found to be key to robust training. The effectiveness of the proposed PR-SSM is evaluated on a set of real-world benchmark datasets in comparison to state-of-the-art probabilistic model learning methods. Scalability and robustness are demonstrated on a high dimensional problem.
APA
Doerr, A., Daniel, C., Schiegg, M., Nguyen-Tuong, D., Schaal, S., Toussaint, M. & Trimpe, S. (2018). Probabilistic Recurrent State-Space Models. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:1280-1289. Available from https://proceedings.mlr.press/v80/doerr18a.html.