End-to-End Learning of Coherent Probabilistic Forecasts for Hierarchical Time Series

Syama Sundar Rangapuram, Lucien D Werner, Konstantinos Benidis, Pedro Mercado, Jan Gasthaus, Tim Januschowski
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:8832-8843, 2021.

Abstract

This paper presents a novel approach for hierarchical time series forecasting that produces coherent, probabilistic forecasts without requiring any explicit post-processing reconciliation. Unlike the state-of-the-art, the proposed method simultaneously learns from all time series in the hierarchy and incorporates the reconciliation step into a single trainable model. This is achieved by applying the reparameterization trick and casting reconciliation as an optimization problem with a closed-form solution. These model features make end-to-end learning of hierarchical forecasts possible, while accomplishing the challenging task of generating forecasts that are both probabilistic and coherent. Importantly, our approach also accommodates general aggregation constraints including grouped and temporal hierarchies. An extensive empirical evaluation on real-world hierarchical datasets demonstrates the advantages of the proposed approach over the state-of-the-art.
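The following is a minimal, illustrative sketch (not the paper's released code) of the core idea the abstract describes: draw probabilistic base forecasts via the reparameterization trick and reconcile each sample in closed form by projecting it onto the coherent subspace {y : Ay = 0} defined by the aggregation constraints. The constraint matrix A, the Gaussian base distribution, and all function names here are illustrative assumptions.

    import numpy as np

    def coherency_projection(A):
        """Closed-form orthogonal projection onto the null space of A.

        A has shape (num_constraints, num_series); each row encodes a
        constraint such as y_total - y_child1 - y_child2 = 0.
        """
        AAt_inv = np.linalg.inv(A @ A.T)
        return np.eye(A.shape[1]) - A.T @ AAt_inv @ A  # P = I - A^T (A A^T)^{-1} A

    def sample_coherent_forecasts(mu, sigma, A, num_samples=100, rng=None):
        """Reparameterized sampling followed by closed-form reconciliation.

        mu, sigma: per-series mean and scale of an (assumed Gaussian) base
        forecast distribution, each of shape (num_series,).
        """
        rng = np.random.default_rng(rng)
        P = coherency_projection(A)
        eps = rng.standard_normal((num_samples, mu.shape[0]))
        base_samples = mu + sigma * eps   # differentiable in mu and sigma
        return base_samples @ P.T         # every projected sample satisfies A y ≈ 0

    # Toy 3-series hierarchy: total = leaf1 + leaf2  =>  A = [1, -1, -1]
    A = np.array([[1.0, -1.0, -1.0]])
    mu = np.array([10.0, 6.0, 5.0])       # incoherent base means (10 != 6 + 5)
    sigma = np.array([1.0, 0.5, 0.5])
    samples = sample_coherent_forecasts(mu, sigma, A, num_samples=4, rng=0)
    print(np.abs(samples @ A.T).max())    # ~0: reconciled samples are coherent

Because both the sampling step and the projection are differentiable, a loss computed on the reconciled samples can be backpropagated through to the forecast parameters, which is what makes the end-to-end training described in the abstract possible.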

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-rangapuram21a,
  title     = {End-to-End Learning of Coherent Probabilistic Forecasts for Hierarchical Time Series},
  author    = {Rangapuram, Syama Sundar and Werner, Lucien D and Benidis, Konstantinos and Mercado, Pedro and Gasthaus, Jan and Januschowski, Tim},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {8832--8843},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/rangapuram21a/rangapuram21a.pdf},
  url       = {https://proceedings.mlr.press/v139/rangapuram21a.html},
  abstract  = {This paper presents a novel approach for hierarchical time series forecasting that produces coherent, probabilistic forecasts without requiring any explicit post-processing reconciliation. Unlike the state-of-the-art, the proposed method simultaneously learns from all time series in the hierarchy and incorporates the reconciliation step into a single trainable model. This is achieved by applying the reparameterization trick and casting reconciliation as an optimization problem with a closed-form solution. These model features make end-to-end learning of hierarchical forecasts possible, while accomplishing the challenging task of generating forecasts that are both probabilistic and coherent. Importantly, our approach also accommodates general aggregation constraints including grouped and temporal hierarchies. An extensive empirical evaluation on real-world hierarchical datasets demonstrates the advantages of the proposed approach over the state-of-the-art.}
}
Endnote
%0 Conference Paper
%T End-to-End Learning of Coherent Probabilistic Forecasts for Hierarchical Time Series
%A Syama Sundar Rangapuram
%A Lucien D Werner
%A Konstantinos Benidis
%A Pedro Mercado
%A Jan Gasthaus
%A Tim Januschowski
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-rangapuram21a
%I PMLR
%P 8832--8843
%U https://proceedings.mlr.press/v139/rangapuram21a.html
%V 139
%X This paper presents a novel approach for hierarchical time series forecasting that produces coherent, probabilistic forecasts without requiring any explicit post-processing reconciliation. Unlike the state-of-the-art, the proposed method simultaneously learns from all time series in the hierarchy and incorporates the reconciliation step into a single trainable model. This is achieved by applying the reparameterization trick and casting reconciliation as an optimization problem with a closed-form solution. These model features make end-to-end learning of hierarchical forecasts possible, while accomplishing the challenging task of generating forecasts that are both probabilistic and coherent. Importantly, our approach also accommodates general aggregation constraints including grouped and temporal hierarchies. An extensive empirical evaluation on real-world hierarchical datasets demonstrates the advantages of the proposed approach over the state-of-the-art.
APA
Rangapuram, S.S., Werner, L.D., Benidis, K., Mercado, P., Gasthaus, J. & Januschowski, T. (2021). End-to-End Learning of Coherent Probabilistic Forecasts for Hierarchical Time Series. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:8832-8843. Available from https://proceedings.mlr.press/v139/rangapuram21a.html.