Minimax-optimal and Locally-adaptive Online Nonparametric Regression

Paul Liautaud, Pierre Gaillard, Olivier Wintenberger
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:702-735, 2025.

Abstract

We study adversarial online nonparametric regression with general convex losses and propose a parameter-free learning algorithm that achieves minimax optimal rates. Our approach leverages chaining trees to compete against Hölder functions and establishes optimal regret bounds. While competing with nonparametric function classes can be challenging, they often exhibit local patterns, such as local Hölder continuity, that online algorithms can exploit. Without prior knowledge, our method dynamically tracks and adapts to different Hölder profiles by pruning a core chaining tree structure, aligning itself with local smoothness variations. This leads to the first computationally efficient algorithm with locally adaptive optimal rates for online regression in an adversarial setting. Finally, we discuss how these notions could be extended to a boosting framework, offering promising directions for future research.

Cite this Paper


BibTeX
@InProceedings{pmlr-v272-liautaud25a,
  title     = {Minimax-optimal and Locally-adaptive Online Nonparametric Regression},
  author    = {Liautaud, Paul and Gaillard, Pierre and Wintenberger, Olivier},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  pages     = {702--735},
  year      = {2025},
  editor    = {Kamath, Gautam and Loh, Po-Ling},
  volume    = {272},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v272/main/assets/liautaud25a/liautaud25a.pdf},
  url       = {https://proceedings.mlr.press/v272/liautaud25a.html},
  abstract  = {We study adversarial online nonparametric regression with general convex losses and propose a parameter-free learning algorithm that achieves minimax optimal rates. Our approach leverages chaining trees to compete against Hölder functions and establishes optimal regret bounds. While competing with nonparametric function classes can be challenging, they often exhibit local patterns, such as local Hölder continuity, that online algorithms can exploit. Without prior knowledge, our method dynamically tracks and adapts to different Hölder profiles by pruning a core chaining tree structure, aligning itself with local smoothness variations. This leads to the first computationally efficient algorithm with locally adaptive optimal rates for online regression in an adversarial setting. Finally, we discuss how these notions could be extended to a boosting framework, offering promising directions for future research.}
}
EndNote
%0 Conference Paper
%T Minimax-optimal and Locally-adaptive Online Nonparametric Regression
%A Paul Liautaud
%A Pierre Gaillard
%A Olivier Wintenberger
%B Proceedings of The 36th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Gautam Kamath
%E Po-Ling Loh
%F pmlr-v272-liautaud25a
%I PMLR
%P 702--735
%U https://proceedings.mlr.press/v272/liautaud25a.html
%V 272
%X We study adversarial online nonparametric regression with general convex losses and propose a parameter-free learning algorithm that achieves minimax optimal rates. Our approach leverages chaining trees to compete against Hölder functions and establishes optimal regret bounds. While competing with nonparametric function classes can be challenging, they often exhibit local patterns, such as local Hölder continuity, that online algorithms can exploit. Without prior knowledge, our method dynamically tracks and adapts to different Hölder profiles by pruning a core chaining tree structure, aligning itself with local smoothness variations. This leads to the first computationally efficient algorithm with locally adaptive optimal rates for online regression in an adversarial setting. Finally, we discuss how these notions could be extended to a boosting framework, offering promising directions for future research.
APA
Liautaud, P., Gaillard, P. & Wintenberger, O. (2025). Minimax-optimal and Locally-adaptive Online Nonparametric Regression. Proceedings of The 36th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 272:702-735. Available from https://proceedings.mlr.press/v272/liautaud25a.html.