Optimal aggregation of affine estimators

Joseph Salmon, Arnak Dalalyan
Proceedings of the 24th Annual Conference on Learning Theory, PMLR 19:635-660, 2011.

Abstract

We consider the problem of combining a (possibly uncountably infinite) set of affine estimators in the non-parametric regression model with heteroscedastic Gaussian noise. Focusing on the exponentially weighted aggregate, we prove a PAC-Bayesian type inequality that leads to sharp oracle inequalities not only in discrete but also in continuous settings. The framework is general enough to cover combinations of various procedures such as least squares regression, kernel ridge regression, shrinkage estimators, and many other estimators used in the literature on statistical inverse problems. As a consequence, we show that the proposed aggregate provides an adaptive estimator in the exact minimax sense without discretizing the range of tuning parameters or splitting the set of observations. We also numerically illustrate the good performance achieved by the exponentially weighted aggregate.
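For a finite family of affine estimators, the exponentially weighted aggregate admits a short self-contained sketch. The Python code below is illustrative only: the function name ewa_affine and all parameter choices are ours, not the paper's, and it assumes homoscedastic Gaussian noise with known variance, a uniform prior over a finite family, and Stein's unbiased risk estimate, whereas the paper handles heteroscedastic noise and possibly continuous families via a PAC-Bayesian bound.

import numpy as np

def ewa_affine(y, estimators, sigma2, beta):
    """Exponentially weighted aggregate (EWA) of affine estimators f_k(y) = A_k y + b_k.

    Minimal sketch under simplifying assumptions: homoscedastic Gaussian noise
    with known variance sigma2, a finite family with a uniform prior, and
    Stein's unbiased risk estimate. The paper's setting is more general
    (heteroscedastic noise, possibly uncountable families of estimators).
    """
    n = y.shape[0]
    preds, risks = [], []
    for A, b in estimators:
        f_k = A @ y + b
        # Unbiased estimate of the risk E||f_k - f||^2 for an affine map (SURE).
        r_k = np.sum((y - f_k) ** 2) + 2.0 * sigma2 * np.trace(A) - n * sigma2
        preds.append(f_k)
        risks.append(r_k)
    risks = np.asarray(risks)
    # Exponential weights with temperature beta; shift by the min for numerical stability.
    w = np.exp(-(risks - risks.min()) / beta)
    w /= w.sum()
    return np.average(np.stack(preds), axis=0, weights=w)

# Example: aggregate ridge-type smoothers S_lam = X (X'X + lam I)^{-1} X'
# over a grid of tuning parameters lam, instead of selecting a single one.
rng = np.random.default_rng(0)
n, p, sigma2 = 100, 20, 1.0
X = rng.standard_normal((n, p))
y = X @ rng.standard_normal(p) + np.sqrt(sigma2) * rng.standard_normal(n)
family = [(X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T), np.zeros(n))
          for lam in np.logspace(-2, 2, 20)]
f_hat = ewa_affine(y, family, sigma2, beta=8 * sigma2)  # temperature of order sigma2

The temperature beta of order sigma2 mirrors the scaling required by the paper's oracle inequality; the exact admissible constant is part of the paper's result and is not reproduced here.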

Cite this Paper


BibTeX
@InProceedings{pmlr-v19-salmon11a,
  title     = {Optimal aggregation of affine estimators},
  author    = {Salmon, Joseph and Dalalyan, Arnak},
  booktitle = {Proceedings of the 24th Annual Conference on Learning Theory},
  pages     = {635--660},
  year      = {2011},
  editor    = {Kakade, Sham M. and von Luxburg, Ulrike},
  volume    = {19},
  series    = {Proceedings of Machine Learning Research},
  address   = {Budapest, Hungary},
  month     = {09--11 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v19/salmon11a/salmon11a.pdf},
  url       = {https://proceedings.mlr.press/v19/salmon11a.html}
}
APA
Salmon, J. & Dalalyan, A. (2011). Optimal aggregation of affine estimators. Proceedings of the 24th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 19:635-660. Available from https://proceedings.mlr.press/v19/salmon11a.html.
