Bayesian Learning from Sequential Data using Gaussian Processes with Signature Covariances

Csaba Toth, Harald Oberhauser
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9548-9560, 2020.

Abstract

We develop a Bayesian approach to learning from sequential data by using Gaussian processes (GPs) with so-called signature kernels as covariance functions. This allows us to make sequences of different lengths comparable and to rely on strong theoretical results from stochastic analysis. Signatures capture sequential structure with tensors that can scale unfavourably in sequence length and state-space dimension. To deal with this, we introduce a sparse variational approach with inducing tensors. We then combine the resulting GP with LSTMs and GRUs to build larger models that leverage the strengths of each of these approaches, and we benchmark the resulting GPs on multivariate time series (TS) classification datasets.
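To illustrate the core idea of the abstract, the sketch below shows how a truncated (level-2) path signature maps sequences of different lengths to a fixed-dimensional feature vector whose inner product gives a valid GP covariance. This is a minimal NumPy sketch under our own assumptions, not the paper's implementation: the names truncated_signature and signature_covariance are illustrative, and it omits the sparse variational inducing tensors and the LSTM/GRU combinations described in the paper.

import numpy as np

def truncated_signature(path, depth=2):
    """Flattened signature of a path of shape (T, d), truncated at depth <= 2 (illustrative)."""
    increments = np.diff(path, axis=0)          # (T-1, d) increments of the path
    level1 = increments.sum(axis=0)             # level 1: total increment, shape (d,)
    feats = [level1]
    if depth >= 2:
        # level 2 for a piecewise-linear path:
        # S2[i, j] = sum_{k<l} dX_k^i dX_l^j + 0.5 * sum_k dX_k^i dX_k^j
        cumsum = np.cumsum(increments, axis=0)
        prev = np.vstack([np.zeros_like(increments[:1]), cumsum[:-1]])
        level2 = prev.T @ increments + 0.5 * (increments.T @ increments)
        feats.append(level2.ravel())
    return np.concatenate(feats)

def signature_covariance(paths_a, paths_b, variance=1.0):
    """Gram matrix of a linear kernel on truncated signatures (a valid GP covariance)."""
    Fa = np.stack([truncated_signature(p) for p in paths_a])
    Fb = np.stack([truncated_signature(p) for p in paths_b])
    return variance * Fa @ Fb.T

# Usage: sequences of different lengths map to signatures of the same dimension.
rng = np.random.default_rng(0)
paths = [rng.standard_normal((T, 3)).cumsum(axis=0) for T in (50, 80, 120)]
K = signature_covariance(paths, paths)
print(K.shape)  # (3, 3) covariance matrix over the three sequences

Because the feature dimension grows as d + d^2 (and exponentially with the truncation depth), a practical method needs the sparsification the paper introduces via inducing tensors; the sketch above only shows why sequences of unequal length become directly comparable.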

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-toth20a,
  title     = {{B}ayesian Learning from Sequential Data using {G}aussian Processes with Signature Covariances},
  author    = {Toth, Csaba and Oberhauser, Harald},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {9548--9560},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/toth20a/toth20a.pdf},
  url       = {https://proceedings.mlr.press/v119/toth20a.html},
  abstract  = {We develop a Bayesian approach to learning from sequential data by using Gaussian processes (GPs) with so-called signature kernels as covariance functions. This allows to make sequences of different length comparable and to rely on strong theoretical results from stochastic analysis. Signatures capture sequential structure with tensors that can scale unfavourably in sequence length and state space dimension. To deal with this, we introduce a sparse variational approach with inducing tensors. We then combine the resulting GP with LSTMs and GRUs to build larger models that leverage the strengths of each of these approaches and benchmark the resulting GPs on multivariate time series (TS) classification datasets.}
}
Endnote
%0 Conference Paper
%T Bayesian Learning from Sequential Data using Gaussian Processes with Signature Covariances
%A Csaba Toth
%A Harald Oberhauser
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-toth20a
%I PMLR
%P 9548--9560
%U https://proceedings.mlr.press/v119/toth20a.html
%V 119
%X We develop a Bayesian approach to learning from sequential data by using Gaussian processes (GPs) with so-called signature kernels as covariance functions. This allows to make sequences of different length comparable and to rely on strong theoretical results from stochastic analysis. Signatures capture sequential structure with tensors that can scale unfavourably in sequence length and state space dimension. To deal with this, we introduce a sparse variational approach with inducing tensors. We then combine the resulting GP with LSTMs and GRUs to build larger models that leverage the strengths of each of these approaches and benchmark the resulting GPs on multivariate time series (TS) classification datasets.
APA
Toth, C. & Oberhauser, H. (2020). Bayesian Learning from Sequential Data using Gaussian Processes with Signature Covariances. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:9548-9560. Available from https://proceedings.mlr.press/v119/toth20a.html.