Nonparametric Distributional Regression via Quantile Regression

Cheng Peng, Stan Uryasev
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:4852-4860, 2025.

Abstract

This paper proposes a new approach to estimating the distribution of a response variable conditioned on factors. We model the conditional quantile function as a mixture (weighted sum) of basis quantile functions, with weights depending on these factors. The estimation problem is formulated as a convex optimization problem. The objective function is equivalent to the continuous ranked probability score (CRPS). This approach can be viewed as conducting quantile regressions for all confidence levels simultaneously while inherently avoiding quantile crossing. We use spline functions of factors as a primary example for the weight function. We prove an approximation property of the model. To address computational challenges, we propose a dimensionality reduction method using tensor decomposition and an alternating algorithm. Our approach offers flexibility, interpretability, tractability, and extendability. Numerical experiments demonstrate its effectiveness.
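As a rough illustration of the modeling idea in the abstract (not the authors' implementation), the following Python sketch fits a small instance of the model: the conditional quantile function is a weighted sum of basis quantile functions, the weights depend on the factor (simple polynomials stand in here for the paper's spline weight functions), and the coefficients are found by a convex program minimizing pinball loss averaged over a grid of confidence levels, a discretized stand-in for the CRPS objective. The specific basis choices, the cvxpy formulation, and the nonnegativity constraint used to prevent quantile crossing are assumptions made for this sketch.

"""
Illustrative sketch only (hypothetical names and basis choices):
Q(alpha | x) = Phi(x) @ B @ g(alpha), fitted by pinball loss averaged
over a grid of confidence levels, as a discretized stand-in for CRPS.
"""
import numpy as np
import cvxpy as cp
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy data: scalar factor x, response y with factor-dependent spread.
n = 300
x = rng.uniform(-1.0, 1.0, size=n)
y = 1.0 + 2.0 * x + (0.5 + 0.4 * np.abs(x)) * rng.standard_normal(n)

# Basis quantile functions g_k(alpha): constant, uniform, Gaussian shapes.
alphas = np.linspace(0.05, 0.95, 19)               # grid of confidence levels
G = np.stack([np.ones_like(alphas),                # location (constant in alpha)
              alphas - 0.5,                        # uniform quantile shape
              norm.ppf(alphas)], axis=1)           # Gaussian quantile shape
K = G.shape[1]

# Factor basis phi_j(x): polynomials stand in for the paper's splines.
Phi = np.stack([np.ones_like(x), x, x**2], axis=1)
J = Phi.shape[1]

# Convex fit: Q(alpha | x) = Phi(x) @ B @ g(alpha), pinball (quantile) loss.
B = cp.Variable((J, K))
Q = Phi @ B @ G.T                                  # n-by-len(alphas) fitted quantiles
Y = np.tile(y[:, None], (1, len(alphas)))
A = np.tile(alphas, (n, 1))
R = Y - Q                                          # residuals at every confidence level
pinball = cp.sum(cp.maximum(cp.multiply(A, R),
                            cp.multiply(A - 1.0, R))) / (n * len(alphas))

# Non-crossing is enforced here by requiring nonnegative weights on the
# increasing basis quantile functions at every observed factor value
# (one simple sufficient condition; the paper's construction may differ).
constraints = [Phi @ B[:, 1:] >= 0]
cp.Problem(cp.Minimize(pinball), constraints).solve()

print("fitted coefficient matrix B:\n", np.round(B.value, 3))

Because each non-constant basis quantile function is nondecreasing and its weight is constrained to be nonnegative at the observed factor values, the fitted quantile curves are nondecreasing in the confidence level there, which is what rules out quantile crossing in this toy setup.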

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-peng25a,
  title     = {Nonparametric Distributional Regression via Quantile Regression},
  author    = {Peng, Cheng and Uryasev, Stan},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {4852--4860},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/peng25a/peng25a.pdf},
  url       = {https://proceedings.mlr.press/v258/peng25a.html},
  abstract  = {This paper proposes a new approach to estimating the distribution of a response variable conditioned on factors. We model the conditional quantile function as a mixture (weighted sum) of basis quantile functions, with weights depending on these factors. The estimation problem is formulated as a convex optimization problem. The objective function is equivalent to the continuous ranked probability score (CRPS). This approach can be viewed as conducting quantile regressions for all confidence levels simultaneously while inherently avoiding quantile crossing. We use spline functions of factors as a primary example for the weight function. We prove an approximation property of the model. To address computational challenges, we propose a dimensionality reduction method using tensor decomposition and an alternating algorithm. Our approach offers flexibility, interpretability, tractability, and extendability. Numerical experiments demonstrate its effectiveness.}
}
Endnote
%0 Conference Paper
%T Nonparametric Distributional Regression via Quantile Regression
%A Cheng Peng
%A Stan Uryasev
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-peng25a
%I PMLR
%P 4852--4860
%U https://proceedings.mlr.press/v258/peng25a.html
%V 258
%X This paper proposes a new approach to estimating the distribution of a response variable conditioned on factors. We model the conditional quantile function as a mixture (weighted sum) of basis quantile functions, with weights depending on these factors. The estimation problem is formulated as a convex optimization problem. The objective function is equivalent to the continuous ranked probability score (CRPS). This approach can be viewed as conducting quantile regressions for all confidence levels simultaneously while inherently avoiding quantile crossing. We use spline functions of factors as a primary example for the weight function. We prove an approximation property of the model. To address computational challenges, we propose a dimensionality reduction method using tensor decomposition and an alternating algorithm. Our approach offers flexibility, interpretability, tractability, and extendability. Numerical experiments demonstrate its effectiveness.
APA
Peng, C. & Uryasev, S. (2025). Nonparametric Distributional Regression via Quantile Regression. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:4852-4860. Available from https://proceedings.mlr.press/v258/peng25a.html.
