Optimal aggregation of affine estimators

Joseph Salmon, Arnak Dalalyan;
Proceedings of the 24th Annual Conference on Learning Theory, PMLR 19:635-660, 2011.

Abstract

We consider the problem of combining a (possibly uncountably infinite) set of affine estimators in a non-parametric regression model with heteroscedastic Gaussian noise. Focusing on the exponentially weighted aggregate, we prove a PAC-Bayesian type inequality that leads to sharp oracle inequalities in discrete but also in continuous settings. The framework is general enough to cover combinations of various procedures such as least squares regression, kernel ridge regression, shrinking estimators, and many other estimators used in the literature on statistical inverse problems. As a consequence, we show that the proposed aggregate provides an adaptive estimator in the exact minimax sense without either discretizing the range of tuning parameters or splitting the set of observations. We also illustrate numerically the good performance achieved by the exponentially weighted aggregate.
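The exponentially weighted aggregate the abstract refers to can be sketched in a few lines: each candidate affine estimator receives a weight proportional to the exponential of the negative of its unbiased risk estimate. The sketch below is illustrative only, not the paper's exact construction: the candidate moving-average smoothers, the Stein-type risk estimate for heteroscedastic Gaussian noise, and the temperature `beta` are all assumptions chosen for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heteroscedastic regression: observe y = f + noise, noise levels known.
n = 200
x = np.linspace(0.0, 1.0, n)
f = np.sin(2.0 * np.pi * x)
sigma = 0.3 + 0.2 * x                      # heteroscedastic noise standard deviations
y = f + sigma * rng.standard_normal(n)

def smooth_matrix(n, h):
    """Row-stochastic moving-average matrix of half-width h (an affine estimator y -> A y)."""
    A = np.zeros((n, n))
    for i in range(n):
        lo, hi = max(0, i - h), min(n, i + h + 1)
        A[i, lo:hi] = 1.0 / (hi - lo)
    return A

widths = [1, 3, 5, 10, 20, 40]             # illustrative family of tuning parameters
mats = [smooth_matrix(n, h) for h in widths]

def risk_estimate(A):
    """Stein-type unbiased estimate of E||A y - f||^2 under Gaussian noise:
    ||y - A y||^2 + 2 tr(A diag(sigma^2)) - sum(sigma^2)."""
    resid = y - A @ y
    return resid @ resid + 2.0 * np.sum(np.diag(A) * sigma**2) - np.sum(sigma**2)

risks = np.array([risk_estimate(A) for A in mats])

# Exponential weights: lower estimated risk -> larger weight; beta is a temperature.
beta = 8.0 * np.mean(sigma**2)             # heuristic choice for this toy example
w = np.exp(-(risks - risks.min()) / beta)
w /= w.sum()

# The aggregate is the weighted combination of the candidate estimators.
f_hat = sum(wk * (A @ y) for wk, A in zip(w, mats))

print("risk estimates:", np.round(risks, 1))
print("weights:", np.round(w, 3))
print("MSE of aggregate:", np.mean((f_hat - f) ** 2))
```

Note that the weights adapt to the data through the risk estimates alone, so no discretization of the tuning-parameter range beyond the chosen candidates, and no sample splitting, is needed in this sketch.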
