Decoupling Local and Global Representations of Time Series

Sana Tonekaboni, Chun-Liang Li, Sercan O. Arik, Anna Goldenberg, Tomas Pfister
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:8700-8714, 2022.

Abstract

Real-world time series data are often generated from several sources of variation. Learning representations that capture the factors contributing to this variability enables better understanding of the data via its underlying generative process and can lead to improvements in performance on downstream machine learning tasks. In this paper, we propose a novel generative approach for learning representations for the global and local factors of variation in time series data. The local representation of each sample models non-stationarity over time with a stochastic process prior, and the global representation of the sample encodes the time-independent characteristics. To encourage decoupling between the representations, we introduce a counterfactual regularization that minimizes the mutual information between the two variables. In experiments, we demonstrate successful recovery of the true local and global factors of variability on simulated data, and show that representations learned using our method lead to superior performance on downstream tasks on real-world datasets. We believe that the proposed way of defining representations is beneficial for data modelling and can yield better insights into the complexity of the real-world data.
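
To make the described setup concrete, below is a minimal PyTorch sketch of an encoder pair of the kind the abstract describes: one recurrent encoder that summarizes a whole series into a single time-invariant global code, and one window-level encoder that produces a sequence of local codes. The class name (LocalGlobalEncoder), window size, and layer widths are illustrative assumptions, not the authors' implementation, and the stochastic process prior and counterfactual mutual-information regularizer from the paper are omitted here.

# Minimal sketch (assumed names and sizes): a global encoder over the full
# series and a local encoder applied per non-overlapping window.
import torch
import torch.nn as nn


class LocalGlobalEncoder(nn.Module):
    def __init__(self, n_features, local_dim=8, global_dim=8, window=16):
        super().__init__()
        self.window = window
        # Global encoder: GRU over the whole series -> one time-invariant code.
        self.global_rnn = nn.GRU(n_features, 32, batch_first=True)
        self.to_global = nn.Linear(32, global_dim)
        # Local encoder: shared MLP applied to each non-overlapping window.
        self.local_net = nn.Sequential(
            nn.Linear(window * n_features, 64), nn.ReLU(),
            nn.Linear(64, local_dim),
        )

    def forward(self, x):
        # x: (batch, time, features); time assumed divisible by `window`.
        b, t, f = x.shape
        _, h = self.global_rnn(x)
        z_global = self.to_global(h[-1])                 # (batch, global_dim)
        windows = x.reshape(b, t // self.window, self.window * f)
        z_local = self.local_net(windows)                # (batch, n_windows, local_dim)
        return z_local, z_global


if __name__ == "__main__":
    enc = LocalGlobalEncoder(n_features=3)
    x = torch.randn(4, 64, 3)                            # 4 series, 64 steps, 3 features
    z_local, z_global = enc(x)
    print(z_local.shape, z_global.shape)                 # torch.Size([4, 4, 8]) torch.Size([4, 8])

In the paper's formulation, a decoder conditioned on both codes would reconstruct the series, with the local codes tied to a stochastic process prior and an additional regularizer discouraging shared information between the two representations; the sketch above only illustrates how the two representations can be separated structurally.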

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-tonekaboni22a,
  title     = {Decoupling Local and Global Representations of Time Series},
  author    = {Tonekaboni, Sana and Li, Chun-Liang and Arik, Sercan O. and Goldenberg, Anna and Pfister, Tomas},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {8700--8714},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/tonekaboni22a/tonekaboni22a.pdf},
  url       = {https://proceedings.mlr.press/v151/tonekaboni22a.html},
  abstract  = {Real-world time series data are often generated from several sources of variation. Learning representations that capture the factors contributing to this variability enables better understanding of the data via its underlying generative process and can lead to improvements in performance on downstream machine learning tasks. In this paper, we propose a novel generative approach for learning representations for the global and local factors of variation in time series data. The local representation of each sample models non-stationarity over time with a stochastic process prior, and the global representation of the sample encodes the time-independent characteristics. To encourage decoupling between the representations, we introduce a counterfactual regularization that minimizes the mutual information between the two variables. In experiments, we demonstrate successful recovery of the true local and global factors of variability on simulated data, and show that representations learned using our method lead to superior performance on downstream tasks on real-world datasets. We believe that the proposed way of defining representations is beneficial for data modelling and can yield better insights into the complexity of the real-world data.}
}
Endnote
%0 Conference Paper
%T Decoupling Local and Global Representations of Time Series
%A Sana Tonekaboni
%A Chun-Liang Li
%A Sercan O. Arik
%A Anna Goldenberg
%A Tomas Pfister
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-tonekaboni22a
%I PMLR
%P 8700--8714
%U https://proceedings.mlr.press/v151/tonekaboni22a.html
%V 151
%X Real-world time series data are often generated from several sources of variation. Learning representations that capture the factors contributing to this variability enables better understanding of the data via its underlying generative process and can lead to improvements in performance on downstream machine learning tasks. In this paper, we propose a novel generative approach for learning representations for the global and local factors of variation in time series data. The local representation of each sample models non-stationarity over time with a stochastic process prior, and the global representation of the sample encodes the time-independent characteristics. To encourage decoupling between the representations, we introduce a counterfactual regularization that minimizes the mutual information between the two variables. In experiments, we demonstrate successful recovery of the true local and global factors of variability on simulated data, and show that representations learned using our method lead to superior performance on downstream tasks on real-world datasets. We believe that the proposed way of defining representations is beneficial for data modelling and can yield better insights into the complexity of the real-world data.
APA
Tonekaboni, S., Li, C., Arik, S.O., Goldenberg, A. & Pfister, T. (2022). Decoupling Local and Global Representations of Time Series. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:8700-8714. Available from https://proceedings.mlr.press/v151/tonekaboni22a.html.