TACTiS: Transformer-Attentional Copulas for Time Series

Alexandre Drouin, Étienne Marcotte, Nicolas Chapados
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:5447-5493, 2022.

Abstract

The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance. However, the practical utility of such estimates is limited by how accurately they quantify predictive uncertainty. In this work, we address the problem of estimating the joint predictive distribution of high-dimensional multivariate time series. We propose a versatile method, based on the transformer architecture, that estimates joint distributions using an attention-based decoder that provably learns to mimic the properties of non-parametric copulas. The resulting model has several desirable properties: it can scale to hundreds of time series, supports both forecasting and interpolation, can handle unaligned and non-uniformly sampled data, and can seamlessly adapt to missing data during training. We demonstrate these properties empirically and show that our model produces state-of-the-art predictions on multiple real-world datasets.

Cite this Paper

BibTeX
@InProceedings{pmlr-v162-drouin22a,
  title     = {{TACT}i{S}: Transformer-Attentional Copulas for Time Series},
  author    = {Drouin, Alexandre and Marcotte, \'Etienne and Chapados, Nicolas},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {5447--5493},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/drouin22a/drouin22a.pdf},
  url       = {https://proceedings.mlr.press/v162/drouin22a.html},
  abstract  = {The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance. However, the practical utility of such estimates is limited by how accurately they quantify predictive uncertainty. In this work, we address the problem of estimating the joint predictive distribution of high-dimensional multivariate time series. We propose a versatile method, based on the transformer architecture, that estimates joint distributions using an attention-based decoder that provably learns to mimic the properties of non-parametric copulas. The resulting model has several desirable properties: it can scale to hundreds of time series, supports both forecasting and interpolation, can handle unaligned and non-uniformly sampled data, and can seamlessly adapt to missing data during training. We demonstrate these properties empirically and show that our model produces state-of-the-art predictions on multiple real-world datasets.}
}
Endnote
%0 Conference Paper
%T TACTiS: Transformer-Attentional Copulas for Time Series
%A Alexandre Drouin
%A Étienne Marcotte
%A Nicolas Chapados
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-drouin22a
%I PMLR
%P 5447--5493
%U https://proceedings.mlr.press/v162/drouin22a.html
%V 162
%X The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance. However, the practical utility of such estimates is limited by how accurately they quantify predictive uncertainty. In this work, we address the problem of estimating the joint predictive distribution of high-dimensional multivariate time series. We propose a versatile method, based on the transformer architecture, that estimates joint distributions using an attention-based decoder that provably learns to mimic the properties of non-parametric copulas. The resulting model has several desirable properties: it can scale to hundreds of time series, supports both forecasting and interpolation, can handle unaligned and non-uniformly sampled data, and can seamlessly adapt to missing data during training. We demonstrate these properties empirically and show that our model produces state-of-the-art predictions on multiple real-world datasets.
APA
Drouin, A., Marcotte, É. & Chapados, N. (2022). TACTiS: Transformer-Attentional Copulas for Time Series. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:5447-5493. Available from https://proceedings.mlr.press/v162/drouin22a.html.