Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting

Youngsuk Park, Danielle Maddix, François-Xavier Aubet, Kelvin Kan, Jan Gasthaus, Yuyang Wang
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:8127-8150, 2022.

Abstract

Quantile regression is an effective technique for quantifying uncertainty, fitting challenging underlying distributions, and often providing full probabilistic predictions through joint learning over multiple quantile levels. A common drawback of these joint quantile regressions, however, is quantile crossing, which violates the desirable monotonicity of the conditional quantile function. In this work, we propose the Incremental (Spline) Quantile Function, I(S)QF, a flexible and efficient distribution-free quantile estimation framework that resolves quantile crossing with a simple neural network layer. Moreover, I(S)QF interpolates and extrapolates to predict arbitrary quantile levels that differ from the ones used in training. Equipped with an analytical evaluation of the continuous ranked probability score (CRPS) for I(S)QF representations, we apply our methods to NN-based time series forecasting, where the savings in expensive re-training costs for non-trained quantile levels are particularly significant. We also provide a generalization error analysis of our proposed approaches under the sequence-to-sequence setting. Lastly, extensive experiments demonstrate improvements in consistency and accuracy errors over other baselines.
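
To make the core mechanism concrete, below is a minimal sketch (PyTorch-style Python) of the incremental-quantile idea: a head emits non-negative increments whose cumulative sum yields a monotone, hence non-crossing, quantile function, and arbitrary quantile levels are answered by linear interpolation between the trained knot levels. This is an illustration under assumptions, not the paper's implementation: the class name IncrementalQuantileHead, the knot choice, and the flat clamping outside the outer knots are hypothetical simplifications (the paper's ISQF uses spline interpolation and models the tails explicitly).

import torch
import torch.nn as nn
import torch.nn.functional as F

class IncrementalQuantileHead(nn.Module):
    """Monotone quantile head: non-negative increments -> no quantile crossing."""

    def __init__(self, in_features: int, knots: torch.Tensor):
        super().__init__()
        # Sorted quantile levels used in training, e.g. torch.linspace(0.1, 0.9, 9).
        self.register_buffer("knots", knots)
        self.base = nn.Linear(in_features, 1)                 # quantile at the first knot
        self.deltas = nn.Linear(in_features, len(knots) - 1)  # raw increment logits

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # softplus keeps increments strictly positive, so the cumulative
        # sum is non-decreasing across quantile levels by construction.
        q0 = self.base(h)                                     # (batch, 1)
        inc = F.softplus(self.deltas(h))                      # (batch, K-1), > 0
        return torch.cat([q0, q0 + inc.cumsum(dim=-1)], dim=-1)  # (batch, K), monotone

    def quantile(self, h: torch.Tensor, alpha: float) -> torch.Tensor:
        # Predict an arbitrary level alpha by interpolating between knots, with
        # no re-training; levels beyond the outer knots are clamped (flat tails).
        q, k = self.forward(h), self.knots
        a = min(max(alpha, float(k[0])), float(k[-1]))
        a_t = torch.tensor([a], dtype=k.dtype, device=k.device)
        idx = torch.searchsorted(k, a_t).clamp(1, len(k) - 1)  # index of right knot
        w = (a_t - k[idx - 1]) / (k[idx] - k[idx - 1])         # interpolation weight
        return (1 - w) * q[..., idx - 1] + w * q[..., idx]     # (batch, 1)

Training such a head would minimize the pinball loss summed over the knot levels or, as the paper advocates, the CRPS, which is available in closed form for these piecewise-linear representations; at prediction time, any level, say 0.33, can then be queried without re-training.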

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-park22a,
  title     = {Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting},
  author    = {Park, Youngsuk and Maddix, Danielle and Aubet, Fran\c{c}ois-Xavier and Kan, Kelvin and Gasthaus, Jan and Wang, Yuyang},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {8127--8150},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/park22a/park22a.pdf},
  url       = {https://proceedings.mlr.press/v151/park22a.html}
}
Endnote
%0 Conference Paper
%T Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting
%A Youngsuk Park
%A Danielle Maddix
%A François-Xavier Aubet
%A Kelvin Kan
%A Jan Gasthaus
%A Yuyang Wang
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-park22a
%I PMLR
%P 8127--8150
%U https://proceedings.mlr.press/v151/park22a.html
%V 151
APA
Park, Y., Maddix, D., Aubet, F., Kan, K., Gasthaus, J. & Wang, Y. (2022). Learning Quantile Functions without Quantile Crossing for Distribution-free Time Series Forecasting. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:8127-8150. Available from https://proceedings.mlr.press/v151/park22a.html.
