TimeBase: The Power of Minimalism in Efficient Long-term Time Series Forecasting

Qihe Huang, Zhengyang Zhou, Kuo Yang, Zhongchao Yi, Xu Wang, Yang Wang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:26227-26246, 2025.

Abstract

Long-term time series forecasting (LTSF) has traditionally relied on large parameters to capture extended temporal dependencies, resulting in substantial computational costs and inefficiencies in both memory usage and processing time. However, time series data, unlike high-dimensional images or text, often exhibit temporal pattern similarity and low-rank structures, especially in long-term horizons. By leveraging this structure, models can be guided to focus on more essential, concise temporal data, improving both accuracy and computational efficiency. In this paper, we introduce TimeBase, an ultra-lightweight network to harness the power of minimalism in LTSF. TimeBase 1) extracts core basis temporal components and 2) transforms traditional point-level forecasting into efficient segment-level forecasting, achieving optimal utilization of both data and parameters. Extensive experiments on diverse real-world datasets show that TimeBase achieves remarkable efficiency and secures competitive forecasting performance. Additionally, TimeBase can also serve as a very effective plug-and-play complexity reducer for any patch-based forecasting models. Code is available at https://github.com/hqh0728/TimeBase.
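The low-rank idea the abstract leans on can be illustrated with a minimal sketch: slice a long series into fixed-length segments, extract a small shared basis via SVD, and work with per-segment coefficients instead of raw points. This is an assumption-laden toy illustration, not the authors' TimeBase implementation; the segment length and rank are arbitrary choices here.

```python
import numpy as np

# Illustrative sketch (not the authors' code): treat a long series as a
# stack of fixed-length segments, extract a low-rank temporal basis via
# SVD, and represent each segment by a few basis coefficients.

def segment(series, seg_len):
    """Reshape a 1-D series into (num_segments, seg_len), dropping any tail."""
    n = len(series) // seg_len
    return series[: n * seg_len].reshape(n, seg_len)

def basis_extract(segments, rank):
    """Top-`rank` right-singular vectors act as a shared temporal basis."""
    _, _, vt = np.linalg.svd(segments, full_matrices=False)
    return vt[:rank]              # shape: (rank, seg_len)

def to_coeffs(segments, basis):
    return segments @ basis.T     # project each segment onto the basis

def from_coeffs(coeffs, basis):
    return coeffs @ basis         # reconstruct segments from coefficients

# Toy usage: a periodic series is captured almost exactly by a tiny basis,
# so segment-level modeling needs far fewer parameters than point-level.
t = np.arange(512)
x = np.sin(2 * np.pi * t / 32)
segs = segment(x, seg_len=32)     # (16, 32)
B = basis_extract(segs, rank=2)
recon = from_coeffs(to_coeffs(segs, B), B)
print(np.allclose(recon, segs, atol=1e-6))
```

A forecasting head would then predict future segment coefficients (e.g. with a small linear map) and decode them through the basis, which is one way the "point-level to segment-level" shift described above can cut both parameters and compute.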

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-huang25az,
  title     = {{T}ime{B}ase: The Power of Minimalism in Efficient Long-term Time Series Forecasting},
  author    = {Huang, Qihe and Zhou, Zhengyang and Yang, Kuo and Yi, Zhongchao and Wang, Xu and Wang, Yang},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {26227--26246},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/huang25az/huang25az.pdf},
  url       = {https://proceedings.mlr.press/v267/huang25az.html},
  abstract  = {Long-term time series forecasting (LTSF) has traditionally relied on large parameters to capture extended temporal dependencies, resulting in substantial computational costs and inefficiencies in both memory usage and processing time. However, time series data, unlike high-dimensional images or text, often exhibit temporal pattern similarity and low-rank structures, especially in long-term horizons. By leveraging this structure, models can be guided to focus on more essential, concise temporal data, improving both accuracy and computational efficiency. In this paper, we introduce TimeBase, an ultra-lightweight network to harness the power of minimalism in LTSF. TimeBase 1) extracts core basis temporal components and 2) transforms traditional point-level forecasting into efficient segment-level forecasting, achieving optimal utilization of both data and parameters. Extensive experiments on diverse real-world datasets show that TimeBase achieves remarkable efficiency and secures competitive forecasting performance. Additionally, TimeBase can also serve as a very effective plug-and-play complexity reducer for any patch-based forecasting models. Code is available at https://github.com/hqh0728/TimeBase.}
}
Endnote
%0 Conference Paper
%T TimeBase: The Power of Minimalism in Efficient Long-term Time Series Forecasting
%A Qihe Huang
%A Zhengyang Zhou
%A Kuo Yang
%A Zhongchao Yi
%A Xu Wang
%A Yang Wang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-huang25az
%I PMLR
%P 26227--26246
%U https://proceedings.mlr.press/v267/huang25az.html
%V 267
%X Long-term time series forecasting (LTSF) has traditionally relied on large parameters to capture extended temporal dependencies, resulting in substantial computational costs and inefficiencies in both memory usage and processing time. However, time series data, unlike high-dimensional images or text, often exhibit temporal pattern similarity and low-rank structures, especially in long-term horizons. By leveraging this structure, models can be guided to focus on more essential, concise temporal data, improving both accuracy and computational efficiency. In this paper, we introduce TimeBase, an ultra-lightweight network to harness the power of minimalism in LTSF. TimeBase 1) extracts core basis temporal components and 2) transforms traditional point-level forecasting into efficient segment-level forecasting, achieving optimal utilization of both data and parameters. Extensive experiments on diverse real-world datasets show that TimeBase achieves remarkable efficiency and secures competitive forecasting performance. Additionally, TimeBase can also serve as a very effective plug-and-play complexity reducer for any patch-based forecasting models. Code is available at https://github.com/hqh0728/TimeBase.
APA
Huang, Q., Zhou, Z., Yang, K., Yi, Z., Wang, X. & Wang, Y. (2025). TimeBase: The Power of Minimalism in Efficient Long-term Time Series Forecasting. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:26227-26246. Available from https://proceedings.mlr.press/v267/huang25az.html.