Support recovery and sup-norm convergence rates for sparse pivotal estimation

Mathurin Massias, Quentin Bertrand, Alexandre Gramfort, Joseph Salmon;
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2655-2665, 2020.

Abstract

In high-dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level. The canonical pivotal estimator is the square-root Lasso, formulated along with its derivatives as a “non-smooth + non-smooth” optimization problem. Modern techniques to solve these include smoothing the data-fitting term, in order to benefit from fast and efficient proximal algorithms. In this work we show minimax sup-norm convergence rates for non-smoothed and smoothed, single-task and multitask square-root Lasso-type estimators. Thanks to our theoretical analysis, we provide guidelines on how to set the smoothing hyperparameter, and illustrate their interest on synthetic data.
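To make the smoothing idea concrete, here is a minimal numerical sketch, not the authors' implementation: the non-smooth data-fit ||y - Xw||_2 / sqrt(n) of the square-root Lasso is replaced by a Huber-like smoothing (the Moreau envelope of the l2 norm, a standard choice), which makes it differentiable so that plain proximal gradient descent with soft-thresholding applies. The function names, the choice of `sigma`, and the solver are illustrative assumptions.

```python
import numpy as np


def soft_threshold(w, t):
    """Proximal operator of t * ||.||_1 (entrywise soft-thresholding)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)


def smoothed_sqrt_lasso(X, y, lam, sigma=0.1, n_iter=1000):
    """Proximal gradient on a smoothed square-root Lasso objective (sketch).

    The data-fit ||y - Xw||_2 / sqrt(n) is smoothed with parameter sigma
    (Moreau envelope of the l2 norm); its gradient in w is
    -X^T r / (sqrt(n) * max(||r||_2, sigma)) with residual r = y - Xw.
    The l1 penalty lam * ||w||_1 is handled exactly by soft-thresholding.
    """
    n, p = X.shape
    w = np.zeros(p)
    # Lipschitz constant of the smoothed data-fit gradient: ||X||_2^2 / (sqrt(n) * sigma).
    L = np.linalg.norm(X, ord=2) ** 2 / (np.sqrt(n) * sigma)
    for _ in range(n_iter):
        r = y - X @ w
        grad = -X.T @ r / (np.sqrt(n) * max(np.linalg.norm(r), sigma))
        w = soft_threshold(w - grad / L, lam / L)
    return w
```

The trade-off the paper's guidelines address is visible in the step size: a smaller `sigma` keeps the smoothed problem closer to the original pivotal objective, but inflates the Lipschitz constant `L` and thus slows the proximal iterations.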
