MF-CLR: Multi-Frequency Contrastive Learning Representation for Time Series

Jufang Duan, Wei Zheng, Yangzhou Du, Wenfa Wu, Haipeng Jiang, Hongsheng Qi
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:11918-11939, 2024.

Abstract

Learning a decent representation from unlabeled time series is a challenging task, especially when the time series data is derived from diverse channels at different sampling rates. Our motivation stems from the financial domain, where sparsely labeled covariates are commonly collected at different frequencies, e.g., daily stock market index, monthly unemployment rate and quarterly net revenue of a certain listed corporation. This paper presents Multi-Frequency Contrastive Learning Representation (MF-CLR), aimed at learning a good representation of multi-frequency time series in a self-supervised paradigm by leveraging the ability of contrastive learning. MF-CLR introduces a hierarchical mechanism that spans across different frequencies along the feature dimension. Within each contrastive block, two groups of subseries with adjacent frequencies are embedded based on our proposed cross-frequency consistency. To validate the effectiveness of MF-CLR, we conduct extensive experiments on five downstream tasks, including long-term and short-term forecasting, classification, anomaly detection and imputation. Experimental evidence shows that MF-CLR delivers a leading performance in all the downstream tasks and keeps consistent performance across different target dataset scales in the transfer learning scenario.
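The abstract names two ingredients without detail: aligning subseries of adjacent sampling frequencies, and embedding them with a contrastive objective. The sketch below is not the authors' MF-CLR implementation; it is a generic illustration of those two ingredients under common conventions (block-mean downsampling for frequency alignment, an InfoNCE-style loss for the contrastive step). All function names and parameters here are hypothetical.

```python
import numpy as np

def align_frequencies(high_freq, ratio):
    # Hypothetical alignment step: downsample a high-frequency series to
    # the adjacent lower frequency by averaging each block of `ratio`
    # consecutive points (e.g. daily -> monthly with ratio ~ 21 trading days).
    n = len(high_freq) // ratio * ratio
    return high_freq[:n].reshape(-1, ratio).mean(axis=1)

def info_nce_loss(anchors, positives, temperature=0.1):
    # Generic InfoNCE-style contrastive loss: each anchor embedding should
    # be most similar to its own positive among all positives in the batch.
    # `anchors` and `positives` are (batch, dim) embedding matrices.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                 # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))            # diagonal = matching pairs
```

In this reading, embeddings of a subseries and of its frequency-aligned neighbor would play the anchor/positive roles; how MF-CLR actually forms pairs within its hierarchical contrastive blocks is specified in the paper itself, not here.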

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-duan24b,
  title     = {{MF}-{CLR}: Multi-Frequency Contrastive Learning Representation for Time Series},
  author    = {Duan, Jufang and Zheng, Wei and Du, Yangzhou and Wu, Wenfa and Jiang, Haipeng and Qi, Hongsheng},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {11918--11939},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/duan24b/duan24b.pdf},
  url       = {https://proceedings.mlr.press/v235/duan24b.html},
  abstract  = {Learning a decent representation from unlabeled time series is a challenging task, especially when the time series data is derived from diverse channels at different sampling rates. Our motivation stems from the financial domain, where sparsely labeled covariates are commonly collected at different frequencies, e.g., daily stock market index, monthly unemployment rate and quarterly net revenue of a certain listed corporation. This paper presents Multi-Frequency Contrastive Learning Representation (MF-CLR), aimed at learning a good representation of multi-frequency time series in a self-supervised paradigm by leveraging the ability of contrastive learning. MF-CLR introduces a hierarchical mechanism that spans across different frequencies along the feature dimension. Within each contrastive block, two groups of subseries with adjacent frequencies are embedded based on our proposed cross-frequency consistency. To validate the effectiveness of MF-CLR, we conduct extensive experiments on five downstream tasks, including long-term and short-term forecasting, classification, anomaly detection and imputation. Experimental evidence shows that MF-CLR delivers a leading performance in all the downstream tasks and keeps consistent performance across different target dataset scales in the transfer learning scenario.}
}
Endnote
%0 Conference Paper
%T MF-CLR: Multi-Frequency Contrastive Learning Representation for Time Series
%A Jufang Duan
%A Wei Zheng
%A Yangzhou Du
%A Wenfa Wu
%A Haipeng Jiang
%A Hongsheng Qi
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-duan24b
%I PMLR
%P 11918--11939
%U https://proceedings.mlr.press/v235/duan24b.html
%V 235
%X Learning a decent representation from unlabeled time series is a challenging task, especially when the time series data is derived from diverse channels at different sampling rates. Our motivation stems from the financial domain, where sparsely labeled covariates are commonly collected at different frequencies, e.g., daily stock market index, monthly unemployment rate and quarterly net revenue of a certain listed corporation. This paper presents Multi-Frequency Contrastive Learning Representation (MF-CLR), aimed at learning a good representation of multi-frequency time series in a self-supervised paradigm by leveraging the ability of contrastive learning. MF-CLR introduces a hierarchical mechanism that spans across different frequencies along the feature dimension. Within each contrastive block, two groups of subseries with adjacent frequencies are embedded based on our proposed cross-frequency consistency. To validate the effectiveness of MF-CLR, we conduct extensive experiments on five downstream tasks, including long-term and short-term forecasting, classification, anomaly detection and imputation. Experimental evidence shows that MF-CLR delivers a leading performance in all the downstream tasks and keeps consistent performance across different target dataset scales in the transfer learning scenario.
APA
Duan, J., Zheng, W., Du, Y., Wu, W., Jiang, H. & Qi, H. (2024). MF-CLR: Multi-Frequency Contrastive Learning Representation for Time Series. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:11918-11939. Available from https://proceedings.mlr.press/v235/duan24b.html.
