Time Series Representations with Hard-Coded Invariances

Thibaut Germain, Chrysoula Kosma, Laurent Oudre
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:19172-19195, 2025.

Abstract

Automatically extracting robust representations from large and complex time series data is becoming imperative for several real-world applications. Unfortunately, the potential of common neural network architectures in capturing invariant properties of time series remains relatively underexplored. For instance, convolutional layers often fail to capture underlying patterns in time series inputs that encompass strong deformations, such as trends. Indeed, invariances to some deformations may be critical for solving complex time series tasks, such as classification, while guaranteeing good generalization performance. To address these challenges, we mathematically formulate and technically design efficient and hard-coded invariant convolutions for specific group actions applicable to the case of time series. We construct these convolutions by considering specific sets of deformations commonly observed in time series, including scaling, offset shift, and trend. We further combine the proposed invariant convolutions with standard convolutions in single embedding layers, and we showcase the layer capacity to capture complex invariant time series properties in several scenarios.

Cite this Paper
BibTeX
@InProceedings{pmlr-v267-germain25a,
  title     = {Time Series Representations with Hard-Coded Invariances},
  author    = {Germain, Thibaut and Kosma, Chrysoula and Oudre, Laurent},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {19172--19195},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/germain25a/germain25a.pdf},
  url       = {https://proceedings.mlr.press/v267/germain25a.html},
  abstract  = {Automatically extracting robust representations from large and complex time series data is becoming imperative for several real-world applications. Unfortunately, the potential of common neural network architectures in capturing invariant properties of time series remains relatively underexplored. For instance, convolutional layers often fail to capture underlying patterns in time series inputs that encompass strong deformations, such as trends. Indeed, invariances to some deformations may be critical for solving complex time series tasks, such as classification, while guaranteeing good generalization performance. To address these challenges, we mathematically formulate and technically design efficient and hard-coded invariant convolutions for specific group actions applicable to the case of time series. We construct these convolutions by considering specific sets of deformations commonly observed in time series, including scaling, offset shift, and trend. We further combine the proposed invariant convolutions with standard convolutions in single embedding layers, and we showcase the layer capacity to capture complex invariant time series properties in several scenarios.}
}
Endnote
%0 Conference Paper
%T Time Series Representations with Hard-Coded Invariances
%A Thibaut Germain
%A Chrysoula Kosma
%A Laurent Oudre
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-germain25a
%I PMLR
%P 19172--19195
%U https://proceedings.mlr.press/v267/germain25a.html
%V 267
%X Automatically extracting robust representations from large and complex time series data is becoming imperative for several real-world applications. Unfortunately, the potential of common neural network architectures in capturing invariant properties of time series remains relatively underexplored. For instance, convolutional layers often fail to capture underlying patterns in time series inputs that encompass strong deformations, such as trends. Indeed, invariances to some deformations may be critical for solving complex time series tasks, such as classification, while guaranteeing good generalization performance. To address these challenges, we mathematically formulate and technically design efficient and hard-coded invariant convolutions for specific group actions applicable to the case of time series. We construct these convolutions by considering specific sets of deformations commonly observed in time series, including scaling, offset shift, and trend. We further combine the proposed invariant convolutions with standard convolutions in single embedding layers, and we showcase the layer capacity to capture complex invariant time series properties in several scenarios.
APA
Germain, T., Kosma, C. & Oudre, L. (2025). Time Series Representations with Hard-Coded Invariances. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:19172-19195. Available from https://proceedings.mlr.press/v267/germain25a.html.