HAR-former: Hybrid Transformer with an Adaptive Time-Frequency Representation Matrix for Long-Term Series Forecasting

Kenghao Zheng, Zi Long, Shuxin Wang
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:1036-1044, 2025.

Abstract

Time series forecasting is crucial across various fields such as economics, energy, transportation planning, and weather prediction. Nevertheless, accurately modeling real-world systems is challenging due to their inherent complexity and non-stationarity. Traditional methods, which often depend on high-dimensional embeddings, can obscure multivariate relationships and struggle with performance limitations, especially when handling complex temporal patterns. To address these issues, we propose HAR-former, a Hybrid Transformer with an Adaptive Time-Frequency Representation Matrix, which combines the strengths of Multi-Layer Perceptrons (MLPs) and Transformers to process trend and seasonal components, respectively. The HAR-former leverages a novel adaptive time-frequency representation matrix to bridge the gap between the time and frequency domains, allowing the model to capture both long-range dependencies and localized patterns. Extensive experimental evaluation on eight real-world benchmark datasets demonstrates that HAR-former outperforms existing state-of-the-art (SOTA) methods, establishing it as a robust solution for complex time series forecasting tasks.
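For intuition only, the following is a minimal sketch of the hybrid design the abstract describes: a moving-average decomposition splits the input into trend and seasonal components, an MLP forecasts the trend, and a Transformer encoder processes the seasonal residual. Every module name and hyperparameter below is an illustrative assumption written in PyTorch, not the authors' implementation, and the adaptive time-frequency representation matrix is omitted because its construction is not specified in this abstract.

import torch
import torch.nn as nn

class SeriesDecomposition(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""
    def __init__(self, kernel_size: int = 25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size=kernel_size, stride=1, padding=0)

    def forward(self, x: torch.Tensor):
        # x: (batch, seq_len, n_channels); pad both ends so the average keeps the length
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        back = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, back], dim=1)
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)
        return trend, x - trend

class HybridForecaster(nn.Module):
    """Illustrative hybrid: MLP on the trend branch, Transformer encoder on the seasonal branch."""
    def __init__(self, seq_len: int, pred_len: int, n_channels: int,
                 d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.decomp = SeriesDecomposition()
        # Trend branch: channel-wise MLP acting along the time axis.
        self.trend_mlp = nn.Sequential(
            nn.Linear(seq_len, 2 * seq_len), nn.GELU(), nn.Linear(2 * seq_len, pred_len)
        )
        # Seasonal branch: per-step embedding, Transformer encoder, projection to the horizon.
        self.embed = nn.Linear(n_channels, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.seasonal_head = nn.Linear(d_model, n_channels)
        self.seasonal_proj = nn.Linear(seq_len, pred_len)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_channels) -> forecast: (batch, pred_len, n_channels)
        trend, seasonal = self.decomp(x)
        trend_out = self.trend_mlp(trend.transpose(1, 2)).transpose(1, 2)
        enc = self.encoder(self.embed(seasonal))        # (batch, seq_len, d_model)
        seasonal_out = self.seasonal_head(enc)          # (batch, seq_len, n_channels)
        seasonal_out = self.seasonal_proj(seasonal_out.transpose(1, 2)).transpose(1, 2)
        return trend_out + seasonal_out

# Example: forecast 96 future steps of a 7-channel series from a 336-step window.
model = HybridForecaster(seq_len=336, pred_len=96, n_channels=7)
y_hat = model(torch.randn(8, 336, 7))   # -> (8, 96, 7)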

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-zheng25b,
  title     = {HAR-former: Hybrid Transformer with an Adaptive Time-Frequency Representation Matrix for Long-Term Series Forecasting},
  author    = {Zheng, Kenghao and Long, Zi and Wang, Shuxin},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {1036--1044},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/zheng25b/zheng25b.pdf},
  url       = {https://proceedings.mlr.press/v258/zheng25b.html},
  abstract  = {Time series forecasting is crucial across various fields such as economics, energy, transportation planning, and weather prediction. Nevertheless, accurately modeling real-world systems is challenging due to their inherent complexity and non-stationarity. Traditional methods, which often depend on high-dimensional embeddings, can obscure multivariate relationships and struggle with performance limitations, especially when handling complex temporal patterns. To address these issues, we propose HAR-former, a Hybrid Transformer with an Adaptive Time-Frequency Representation Matrix, which combines the strengths of Multi-Layer Perceptrons (MLPs) and Transformers to process trend and seasonal components, respectively. The HAR-former leverages a novel adaptive time-frequency representation matrix to bridge the gap between the time and frequency domains, allowing the model to capture both long-range dependencies and localized patterns. Extensive experimental evaluation on eight real-world benchmark datasets demonstrates that HAR-former outperforms existing state-of-the-art (SOTA) methods, establishing it as a robust solution for complex time series forecasting tasks.}
}
Endnote
%0 Conference Paper
%T HAR-former: Hybrid Transformer with an Adaptive Time-Frequency Representation Matrix for Long-Term Series Forecasting
%A Kenghao Zheng
%A Zi Long
%A Shuxin Wang
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-zheng25b
%I PMLR
%P 1036--1044
%U https://proceedings.mlr.press/v258/zheng25b.html
%V 258
%X Time series forecasting is crucial across various fields such as economics, energy, transportation planning, and weather prediction. Nevertheless, accurately modeling real-world systems is challenging due to their inherent complexity and non-stationarity. Traditional methods, which often depend on high-dimensional embeddings, can obscure multivariate relationships and struggle with performance limitations, especially when handling complex temporal patterns. To address these issues, we propose HAR-former, a Hybrid Transformer with an Adaptive Time-Frequency Representation Matrix, which combines the strengths of Multi-Layer Perceptrons (MLPs) and Transformers to process trend and seasonal components, respectively. The HAR-former leverages a novel adaptive time-frequency representation matrix to bridge the gap between the time and frequency domains, allowing the model to capture both long-range dependencies and localized patterns. Extensive experimental evaluation on eight real-world benchmark datasets demonstrates that HAR-former outperforms existing state-of-the-art (SOTA) methods, establishing it as a robust solution for complex time series forecasting tasks.
APA
Zheng, K., Long, Z. & Wang, S. (2025). HAR-former: Hybrid Transformer with an Adaptive Time-Frequency Representation Matrix for Long-Term Series Forecasting. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:1036-1044. Available from https://proceedings.mlr.press/v258/zheng25b.html.
