Quantile Additive Trend Filtering

Zhi Zhang, Kyle Ritscher, Oscar Hernan Madrid Padilla
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:4735-4743, 2025.

Abstract

This paper introduces and analyzes quantile additive trend filtering, a novel approach to modeling the conditional quantiles of a response variable given multivariate covariates. Under the assumption that the true model is additive and that its components are functions whose $r$th-order weak derivatives have bounded total variation, our estimator is a constrained version of quantile trend filtering within additive models. The primary theoretical contributions are error rates for our estimator in both the fixed and the growing input dimension regimes. In the fixed-dimension case, we show that our estimator attains a rate that mirrors the non-quantile minimax rate for additive trend filtering, featuring the main term $n^{-2r/(2r+1)}$. For growing input dimension $d$, our rate carries an additional polynomial factor $d^{(2r+2)/(2r+1)}$. We propose a practical algorithm for implementing quantile additive trend filtering based on dimension-wise backfitting. Experiments on both real data and simulations confirm our theoretical findings. A public implementation of the algorithm is available at \url{https://github.com/zzh237/QATF}.
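The sketch below illustrates, in Python, one way the dimension-wise backfitting described in the abstract could be organized: each additive component is updated by solving a univariate constrained quantile trend filtering problem (pinball loss subject to a total-variation budget on the $(r+1)$th-order discrete differences, which corresponds to bounded total variation of the $r$th weak derivative on a uniform grid) applied to the partial residuals. This is a minimal illustration under assumed conventions (cvxpy as the solver, a fixed constraint level V, uniform design spacing, and a fixed intercept); it is not the authors' reference implementation, for which see the linked repository.

import numpy as np
import cvxpy as cp

def pinball(u, tau):
    # Quantile (pinball) loss at level tau: rho_tau(u) = max(tau*u, (tau-1)*u).
    return cp.sum(cp.maximum(tau * u, (tau - 1) * u))

def diff_matrix(n, r):
    # (r+1)-th order discrete difference operator used for order-r trend filtering.
    D = np.eye(n)
    for _ in range(r + 1):
        D = np.diff(D, axis=0)
    return D

def qatf_backfit(X, y, tau=0.5, r=1, V=5.0, n_iter=20):
    # Dimension-wise backfitting sketch for quantile additive trend filtering.
    # V is an assumed total-variation budget; tuning it (e.g., by cross-validation)
    # is left out for brevity.
    n, d = X.shape
    fhat = np.zeros((n, d))            # fitted component values at the sample points
    intercept = np.quantile(y, tau)    # rough initial intercept, held fixed here
    D = diff_matrix(n, r)
    for _ in range(n_iter):
        for j in range(d):
            order = np.argsort(X[:, j])
            # Partial residuals: remove the intercept and all other components.
            partial = y - intercept - (fhat.sum(axis=1) - fhat[:, j])
            theta = cp.Variable(n)
            # Constrained univariate quantile trend filtering on coordinate j:
            # minimize the pinball loss subject to a TV budget and a centering constraint.
            prob = cp.Problem(
                cp.Minimize(pinball(partial[order] - theta, tau)),
                [cp.norm1(D @ theta) <= V, cp.sum(theta) == 0],
            )
            prob.solve()
            fhat[order, j] = theta.value
    return intercept, fhat

For example, qatf_backfit(X, y, tau=0.9, r=1) would estimate the conditional 0.9-quantile surface as a sum of univariate piecewise-linear components, since constraining second-order differences yields piecewise-linear fits.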

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-zhang25k,
  title     = {Quantile Additive Trend Filtering},
  author    = {Zhang, Zhi and Ritscher, Kyle and Madrid Padilla, Oscar Hernan},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {4735--4743},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/zhang25k/zhang25k.pdf},
  url       = {https://proceedings.mlr.press/v258/zhang25k.html}
}
Endnote
%0 Conference Paper
%T Quantile Additive Trend Filtering
%A Zhi Zhang
%A Kyle Ritscher
%A Oscar Hernan Madrid Padilla
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-zhang25k
%I PMLR
%P 4735--4743
%U https://proceedings.mlr.press/v258/zhang25k.html
%V 258
APA
Zhang, Z., Ritscher, K. & Madrid Padilla, O. H. (2025). Quantile Additive Trend Filtering. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:4735-4743. Available from https://proceedings.mlr.press/v258/zhang25k.html.