Support recovery and sup-norm convergence rates for sparse pivotal estimation

Mathurin Massias, Quentin Bertrand, Alexandre Gramfort, Joseph Salmon
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2655-2665, 2020.

Abstract

In high-dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level. The canonical pivotal estimator is the square-root Lasso, formulated along with its derivatives as a “non-smooth + non-smooth” optimization problem. Modern techniques to solve these include smoothing the data-fitting term, to benefit from fast and efficient proximal algorithms. In this work we show minimax sup-norm convergence rates for non-smoothed and smoothed, single-task and multi-task square-root Lasso-type estimators. Thanks to our theoretical analysis, we provide guidelines on how to set the smoothing hyperparameter, and illustrate the practical interest of these guidelines on synthetic data.
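To make the "smoothing the data-fitting term" idea concrete: the square-root Lasso minimizes ||y − Xβ||₂/√n + λ||β||₁, whose first term is non-smooth at a zero residual. A common workaround is to replace it by its Moreau envelope (a Huber-type function with smoothing parameter σ), after which standard proximal gradient descent applies. The sketch below is illustrative only, not the authors' implementation; the solver (plain ISTA), the smoothing parameter `sigma`, and all variable names are assumptions.

```python
import numpy as np

def sqrt_lasso_smoothed(X, y, lam, sigma=0.1, n_iter=500):
    """Proximal gradient (ISTA) on a sigma-smoothed square-root Lasso.

    The l2 data-fitting term ||y - X b||_2 / sqrt(n) is replaced by its
    Moreau envelope, whose gradient is -X.T r / (sqrt(n) * max(sigma, ||r||)).
    """
    n, p = X.shape
    beta = np.zeros(p)
    # Global Lipschitz bound on the smoothed data-fit gradient.
    L = np.linalg.norm(X, 2) ** 2 / (np.sqrt(n) * sigma)
    for _ in range(n_iter):
        r = y - X @ beta
        grad = -X.T @ r / (np.sqrt(n) * max(sigma, np.linalg.norm(r)))
        z = beta - grad / L
        # Soft-thresholding: the proximal operator of lam * ||.||_1.
        beta = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return beta
```

The smoothed objective coincides with the original one whenever the residual norm exceeds σ, which is the regime the paper's guidelines on choosing σ are concerned with.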

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-massias20a,
  title     = {Support recovery and sup-norm convergence rates for sparse pivotal estimation},
  author    = {Massias, Mathurin and Bertrand, Quentin and Gramfort, Alexandre and Salmon, Joseph},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {2655--2665},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/massias20a/massias20a.pdf},
  url       = {https://proceedings.mlr.press/v108/massias20a.html},
  abstract  = {In high dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level. The canonical pivotal estimator is the square-root Lasso, formulated along with its derivatives as a “non-smooth + non-smooth” optimization problem. Modern techniques to solve these include smoothing the datafitting term, to benefit from fast efficient proximal algorithms. In this work we show minimax sup-norm convergence rates for non smoothed and smoothed, single task and multitask square-root Lasso-type estimators. Thanks to our theoretical analysis, we provide some guidelines on how to set the smoothing hyperparameter, and illustrate on synthetic data the interest of such guidelines.}
}
Endnote
%0 Conference Paper
%T Support recovery and sup-norm convergence rates for sparse pivotal estimation
%A Mathurin Massias
%A Quentin Bertrand
%A Alexandre Gramfort
%A Joseph Salmon
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-massias20a
%I PMLR
%P 2655--2665
%U https://proceedings.mlr.press/v108/massias20a.html
%V 108
%X In high dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level. The canonical pivotal estimator is the square-root Lasso, formulated along with its derivatives as a “non-smooth + non-smooth” optimization problem. Modern techniques to solve these include smoothing the datafitting term, to benefit from fast efficient proximal algorithms. In this work we show minimax sup-norm convergence rates for non smoothed and smoothed, single task and multitask square-root Lasso-type estimators. Thanks to our theoretical analysis, we provide some guidelines on how to set the smoothing hyperparameter, and illustrate on synthetic data the interest of such guidelines.
APA
Massias, M., Bertrand, Q., Gramfort, A. &amp; Salmon, J. (2020). Support recovery and sup-norm convergence rates for sparse pivotal estimation. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:2655-2665. Available from https://proceedings.mlr.press/v108/massias20a.html.