Traversing Time with Multi-Resolution Gaussian Process State-Space Models

Krista Longi, Jakob Lindinger, Olaf Duennbier, Melih Kandemir, Arto Klami, Barbara Rakitsch
Proceedings of The 4th Annual Learning for Dynamics and Control Conference, PMLR 168:366-377, 2022.

Abstract

Gaussian Process state-space models capture complex temporal dependencies in a principled manner by placing a Gaussian Process prior on the transition function. These models have a natural interpretation as discretized stochastic differential equations, but inference for long sequences with fast and slow transitions is difficult. Fast transitions need tight discretizations whereas slow transitions require backpropagating the gradients over long subtrajectories. We propose a novel Gaussian process state-space architecture composed of multiple components, each trained on a different resolution, to model effects on different timescales. The combined model allows traversing time on adaptive scales, providing efficient inference for arbitrarily long sequences with complex dynamics. We benchmark our novel method on semi-synthetic data and on an engine modeling task. In both experiments, our approach compares favorably against its state-of-the-art alternatives that operate on a single time-scale only.
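The abstract describes an architecture in which separate GP components model dynamics on different timescales and are rolled out with different step sizes. The following sketch is purely illustrative and not the authors' method: it combines the posterior means of two hypothetical GP transition models (a slow and a fast component, fit here to made-up drift data) in a two-level Euler-style rollout, stepping the slow component on a coarse grid and the fast component on a finer sub-grid.

import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between two sets of 1-D states.
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

class GPTransition:
    # Exact GP regression on (state, drift) pairs; predicts the mean drift.
    def __init__(self, x_train, y_train, lengthscale, noise=1e-3):
        self.x, self.ls = x_train, lengthscale
        K = rbf(x_train, x_train, lengthscale) + noise * np.eye(len(x_train))
        self.alpha = np.linalg.solve(K, y_train)

    def mean(self, x):
        return rbf(np.atleast_1d(x), self.x, self.ls) @ self.alpha

# Hypothetical training data: a slow mean-reverting drift and a fast
# oscillatory drift, standing in for learned per-resolution components.
xs = np.linspace(-3, 3, 30)
slow = GPTransition(xs, -0.05 * xs, lengthscale=2.0)            # slow component
fast = GPTransition(xs, 0.3 * np.sin(4 * xs), lengthscale=0.3)  # fast component

def rollout(x0, n_coarse, substeps, dt_coarse):
    # Traverse time on two scales: one coarse step for the slow component
    # per outer iteration, several fine sub-steps for the fast component.
    x, traj = x0, [x0]
    dt_fine = dt_coarse / substeps
    for _ in range(n_coarse):
        x = x + dt_coarse * slow.mean(x)[0]       # slow dynamics, large step
        for _ in range(substeps):
            x = x + dt_fine * fast.mean(x)[0]     # fast dynamics, small steps
        traj.append(x)
    return np.array(traj)

print(rollout(x0=1.0, n_coarse=20, substeps=10, dt_coarse=0.5))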

Cite this Paper


BibTeX
@InProceedings{pmlr-v168-longi22a,
  title     = {Traversing Time with Multi-Resolution Gaussian Process State-Space Models},
  author    = {Longi, Krista and Lindinger, Jakob and Duennbier, Olaf and Kandemir, Melih and Klami, Arto and Rakitsch, Barbara},
  booktitle = {Proceedings of The 4th Annual Learning for Dynamics and Control Conference},
  pages     = {366--377},
  year      = {2022},
  editor    = {Firoozi, Roya and Mehr, Negar and Yel, Esen and Antonova, Rika and Bohg, Jeannette and Schwager, Mac and Kochenderfer, Mykel},
  volume    = {168},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--24 Jun},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v168/longi22a/longi22a.pdf},
  url       = {https://proceedings.mlr.press/v168/longi22a.html},
  abstract  = {Gaussian Process state-space models capture complex temporal dependencies in a principled manner by placing a Gaussian Process prior on the transition function. These models have a natural interpretation as discretized stochastic differential equations, but inference for long sequences with fast and slow transitions is difficult. Fast transitions need tight discretizations whereas slow transitions require backpropagating the gradients over long subtrajectories. We propose a novel Gaussian process state-space architecture composed of multiple components, each trained on a different resolution, to model effects on different timescales. The combined model allows traversing time on adaptive scales, providing efficient inference for arbitrarily long sequences with complex dynamics. We benchmark our novel method on semi-synthetic data and on an engine modeling task. In both experiments, our approach compares favorably against its state-of-the-art alternatives that operate on a single time-scale only.}
}
Endnote
%0 Conference Paper
%T Traversing Time with Multi-Resolution Gaussian Process State-Space Models
%A Krista Longi
%A Jakob Lindinger
%A Olaf Duennbier
%A Melih Kandemir
%A Arto Klami
%A Barbara Rakitsch
%B Proceedings of The 4th Annual Learning for Dynamics and Control Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Roya Firoozi
%E Negar Mehr
%E Esen Yel
%E Rika Antonova
%E Jeannette Bohg
%E Mac Schwager
%E Mykel Kochenderfer
%F pmlr-v168-longi22a
%I PMLR
%P 366--377
%U https://proceedings.mlr.press/v168/longi22a.html
%V 168
%X Gaussian Process state-space models capture complex temporal dependencies in a principled manner by placing a Gaussian Process prior on the transition function. These models have a natural interpretation as discretized stochastic differential equations, but inference for long sequences with fast and slow transitions is difficult. Fast transitions need tight discretizations whereas slow transitions require backpropagating the gradients over long subtrajectories. We propose a novel Gaussian process state-space architecture composed of multiple components, each trained on a different resolution, to model effects on different timescales. The combined model allows traversing time on adaptive scales, providing efficient inference for arbitrarily long sequences with complex dynamics. We benchmark our novel method on semi-synthetic data and on an engine modeling task. In both experiments, our approach compares favorably against its state-of-the-art alternatives that operate on a single time-scale only.
APA
Longi, K., Lindinger, J., Duennbier, O., Kandemir, M., Klami, A. & Rakitsch, B. (2022). Traversing Time with Multi-Resolution Gaussian Process State-Space Models. Proceedings of The 4th Annual Learning for Dynamics and Control Conference, in Proceedings of Machine Learning Research 168:366-377. Available from https://proceedings.mlr.press/v168/longi22a.html.