Local Risk Bounds for Statistical Aggregation

Jaouad Mourtada, Tomas Vaškevičius, Nikita Zhivotovskiy
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:5697-5698, 2023.

Abstract

In the problem of aggregation, the aim is to combine a given class of base predictors to achieve predictions nearly as accurate as the best one. In this flexible framework, no assumption is made on the structure of the class or the nature of the target. Aggregation has been studied in both sequential and statistical contexts. Despite some important differences between the two problems, the classical results in both cases feature the same global complexity measure. In this paper, we revisit and tighten classical results in the theory of aggregation in the statistical setting by replacing the global complexity with a smaller, local one. Some of our proofs build on the PAC-Bayes localization technique introduced by Catoni. Among other results, we prove localized versions of the classical bound for the exponential weights estimator due to Leung and Barron and deviation-optimal bounds for the Q-aggregation estimator. These bounds improve over the results of Dai, Rigollet and Zhang for fixed design regression and the results of Lecué and Rigollet for random design regression.
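To make the aggregation setting concrete, here is a minimal sketch of an exponential-weights style aggregate: each base predictor is weighted in proportion to the exponential of its (negative, scaled) empirical loss, so predictors with smaller risk dominate the combination. This is an illustrative toy, not the paper's exact estimator (the names, the temperature parameter `eta`, and the use of plain empirical squared loss are assumptions for the sketch).

```python
import numpy as np

def exponential_weights_aggregate(predictions, y, eta=1.0):
    """Combine base predictors by exponentially weighting their
    empirical squared losses (illustrative sketch only).

    predictions: array of shape (M, n) -- each row is one base
        predictor evaluated at the n sample points.
    y: array of shape (n,) -- observed responses.
    eta: temperature parameter controlling how sharply the
        weights concentrate on low-risk predictors.
    Returns the aggregated prediction, an array of shape (n,).
    """
    # empirical risk of each base predictor
    losses = np.mean((predictions - y) ** 2, axis=1)
    # subtract the minimum loss before exponentiating, for numerical stability
    w = np.exp(-eta * (losses - losses.min()))
    w /= w.sum()
    # convex combination of the base predictions
    return w @ predictions

# toy usage: three constant predictors; the middle one matches the target
y = np.array([1.0, 1.0, 1.0, 1.0])
preds = np.vstack([np.zeros(4), np.ones(4), 2 * np.ones(4)])
agg = exponential_weights_aggregate(preds, y, eta=10.0)
```

With a large `eta` the weights concentrate almost entirely on the best base predictor, so the aggregate is close to the constant-one predictor here; with `eta` near zero the aggregate approaches the plain average of the base predictors.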

Cite this Paper


BibTeX
@InProceedings{pmlr-v195-mourtada23a,
  title     = {Local Risk Bounds for Statistical Aggregation},
  author    = {Mourtada, Jaouad and Va{\v{s}}kevi{\v{c}}ius, Tomas and Zhivotovskiy, Nikita},
  booktitle = {Proceedings of Thirty Sixth Conference on Learning Theory},
  pages     = {5697--5698},
  year      = {2023},
  editor    = {Neu, Gergely and Rosasco, Lorenzo},
  volume    = {195},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--15 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v195/mourtada23a/mourtada23a.pdf},
  url       = {https://proceedings.mlr.press/v195/mourtada23a.html},
  abstract  = {In the problem of aggregation, the aim is to combine a given class of base predictors to achieve predictions nearly as accurate as the best one. In this flexible framework, no assumption is made on the structure of the class or the nature of the target. Aggregation has been studied in both sequential and statistical contexts. Despite some important differences between the two problems, the classical results in both cases feature the same global complexity measure. In this paper, we revisit and tighten classical results in the theory of aggregation in the statistical setting by replacing the global complexity with a smaller, local one. Some of our proofs build on the PAC-Bayes localization technique introduced by Catoni. Among other results, we prove localized versions of the classical bound for the exponential weights estimator due to Leung and Barron and deviation-optimal bounds for the Q-aggregation estimator. These bounds improve over the results of Dai, Rigollet and Zhang for fixed design regression and the results of Lecué and Rigollet for random design regression.}
}
Endnote
%0 Conference Paper
%T Local Risk Bounds for Statistical Aggregation
%A Jaouad Mourtada
%A Tomas Vaškevičius
%A Nikita Zhivotovskiy
%B Proceedings of Thirty Sixth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Gergely Neu
%E Lorenzo Rosasco
%F pmlr-v195-mourtada23a
%I PMLR
%P 5697--5698
%U https://proceedings.mlr.press/v195/mourtada23a.html
%V 195
%X In the problem of aggregation, the aim is to combine a given class of base predictors to achieve predictions nearly as accurate as the best one. In this flexible framework, no assumption is made on the structure of the class or the nature of the target. Aggregation has been studied in both sequential and statistical contexts. Despite some important differences between the two problems, the classical results in both cases feature the same global complexity measure. In this paper, we revisit and tighten classical results in the theory of aggregation in the statistical setting by replacing the global complexity with a smaller, local one. Some of our proofs build on the PAC-Bayes localization technique introduced by Catoni. Among other results, we prove localized versions of the classical bound for the exponential weights estimator due to Leung and Barron and deviation-optimal bounds for the Q-aggregation estimator. These bounds improve over the results of Dai, Rigollet and Zhang for fixed design regression and the results of Lecué and Rigollet for random design regression.
APA
Mourtada, J., Vaškevičius, T. & Zhivotovskiy, N. (2023). Local Risk Bounds for Statistical Aggregation. Proceedings of Thirty Sixth Conference on Learning Theory, in Proceedings of Machine Learning Research 195:5697-5698. Available from https://proceedings.mlr.press/v195/mourtada23a.html.