Deep Non-crossing Quantiles through the Partial Derivative

Axel Brando (Barcelona Supercomputing Center), Joan Gimeno, Jose Rodriguez-Serrano, Jordi Vitria
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:7902-7914, 2022.

Abstract

Quantile Regression (QR) provides a way to approximate a single conditional quantile. To have a more informative description of the conditional distribution, QR can be merged with deep learning techniques to simultaneously estimate multiple quantiles. However, the minimisation of the QR-loss function does not guarantee non-crossing quantiles, which affects the validity of such predictions and introduces a critical issue in certain scenarios. In this article, we propose a generic deep learning algorithm for predicting an arbitrary number of quantiles that ensures the quantile monotonicity constraint up to the machine precision and maintains its modelling performance with respect to alternative models. The presented method is evaluated over several real-world datasets obtaining state-of-the-art results as well as showing that it scales to large-size data sets.
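The core idea the abstract describes, guaranteeing that predicted quantiles never cross, can be illustrated with a small sketch. This is not the paper's exact algorithm (which works through the partial derivative with respect to the quantile level); it is a minimal, hypothetical illustration of the two ingredients involved: the pinball (QR) loss for a single quantile level, and a construction that makes multiple quantile outputs monotone by building them from strictly positive increments.

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss for a single quantile level tau in (0, 1)."""
    diff = y_true - y_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

def softplus(z):
    # Numerically stable softplus: log(1 + exp(z)), strictly positive.
    return np.logaddexp(0.0, z)

def non_crossing_quantiles(base, raw_increments):
    """Build monotone quantile predictions from unconstrained outputs.

    `base` is the prediction for the lowest quantile level; each raw
    increment is passed through softplus so every step is > 0, which
    guarantees q_1 < q_2 < ... < q_K by construction, up to machine
    precision, for any values of the unconstrained outputs.
    """
    steps = softplus(np.asarray(raw_increments))
    return base + np.concatenate([[0.0], np.cumsum(steps)])

# Example: arbitrary (unconstrained) network head outputs for 5 levels.
rng = np.random.default_rng(0)
q = non_crossing_quantiles(base=-1.3, raw_increments=rng.normal(size=4))
assert np.all(np.diff(q) > 0)  # monotone, hence non-crossing
```

Minimising the sum of `pinball_loss` over the quantile levels trains all quantiles jointly; because monotonicity holds by construction rather than through the loss, no crossing penalty is needed.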

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-brando22a,
  title = {Deep Non-crossing Quantiles through the Partial Derivative},
  author = {Brando, Axel and Gimeno, Joan and Rodriguez-Serrano, Jose and Vitria, Jordi},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages = {7902--7914},
  year = {2022},
  editor = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume = {151},
  series = {Proceedings of Machine Learning Research},
  month = {28--30 Mar},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v151/brando22a/brando22a.pdf},
  url = {https://proceedings.mlr.press/v151/brando22a.html},
  abstract = {Quantile Regression (QR) provides a way to approximate a single conditional quantile. To have a more informative description of the conditional distribution, QR can be merged with deep learning techniques to simultaneously estimate multiple quantiles. However, the minimisation of the QR-loss function does not guarantee non-crossing quantiles, which affects the validity of such predictions and introduces a critical issue in certain scenarios. In this article, we propose a generic deep learning algorithm for predicting an arbitrary number of quantiles that ensures the quantile monotonicity constraint up to the machine precision and maintains its modelling performance with respect to alternative models. The presented method is evaluated over several real-world datasets obtaining state-of-the-art results as well as showing that it scales to large-size data sets.}
}
Endnote
%0 Conference Paper
%T Deep Non-crossing Quantiles through the Partial Derivative
%A Axel Brando
%A Joan Gimeno
%A Jose Rodriguez-Serrano
%A Jordi Vitria
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-brando22a
%I PMLR
%P 7902--7914
%U https://proceedings.mlr.press/v151/brando22a.html
%V 151
%X Quantile Regression (QR) provides a way to approximate a single conditional quantile. To have a more informative description of the conditional distribution, QR can be merged with deep learning techniques to simultaneously estimate multiple quantiles. However, the minimisation of the QR-loss function does not guarantee non-crossing quantiles, which affects the validity of such predictions and introduces a critical issue in certain scenarios. In this article, we propose a generic deep learning algorithm for predicting an arbitrary number of quantiles that ensures the quantile monotonicity constraint up to the machine precision and maintains its modelling performance with respect to alternative models. The presented method is evaluated over several real-world datasets obtaining state-of-the-art results as well as showing that it scales to large-size data sets.
APA
Brando, A., Gimeno, J., Rodriguez-Serrano, J. & Vitria, J. (2022). Deep Non-crossing Quantiles through the Partial Derivative. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:7902-7914. Available from https://proceedings.mlr.press/v151/brando22a.html.

Related Material