Tight bounds for minimum $\ell_1$-norm interpolation of noisy data

Guillaume Wang, Konstantin Donhauser, Fanny Yang
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:10572-10602, 2022.

Abstract

We provide matching upper and lower bounds of order $\sigma^2/\log(d/n)$ for the prediction error of the minimum $\ell_1$-norm interpolator, a.k.a. basis pursuit. Our result is tight up to negligible terms when $d \gg n$, and is the first to imply asymptotic consistency of noisy minimum-norm interpolation for isotropic features and sparse ground truths. Our work complements the literature on "benign overfitting" for minimum $\ell_2$-norm interpolation, where asymptotic consistency can be achieved only when the features are effectively low-dimensional.
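The object of study and the bound stated in the abstract can be written out explicitly. The definition of the minimum $\ell_1$-norm interpolator (basis pursuit) below is standard; the rate display simply restates the abstract's claimed order, with $\sigma^2$ the noise variance, $n$ the sample size, and $d$ the ambient dimension.

```latex
% Minimum \ell_1-norm interpolator (basis pursuit) for data (X, y),
% X \in \mathbb{R}^{n \times d}, y = X\beta^* + \xi:
\hat{\beta} \;=\; \arg\min_{\beta \in \mathbb{R}^d} \|\beta\|_1
\quad \text{subject to} \quad X\beta = y.

% Matching upper and lower bounds on the prediction error,
% tight up to negligible terms when d \gg n:
\|\hat{\beta} - \beta^*\|_2^2 \;\asymp\; \frac{\sigma^2}{\log(d/n)}.
```

Note that the rate vanishes as $d/n \to \infty$, which is what yields asymptotic consistency of the noisy interpolator in the highly overparameterized regime.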

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-wang22k,
  title     = {Tight bounds for minimum $\ell_1$-norm interpolation of noisy data},
  author    = {Wang, Guillaume and Donhauser, Konstantin and Yang, Fanny},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {10572--10602},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/wang22k/wang22k.pdf},
  url       = {https://proceedings.mlr.press/v151/wang22k.html},
  abstract  = {We provide matching upper and lower bounds of order $\sigma^2/\log(d/n)$ for the prediction error of the minimum $\ell_1$-norm interpolator, a.k.a. basis pursuit. Our result is tight up to negligible terms when $d \gg n$, and is the first to imply asymptotic consistency of noisy minimum-norm interpolation for isotropic features and sparse ground truths. Our work complements the literature on "benign overfitting" for minimum $\ell_2$-norm interpolation, where asymptotic consistency can be achieved only when the features are effectively low-dimensional.}
}
Endnote
%0 Conference Paper
%T Tight bounds for minimum $\ell_1$-norm interpolation of noisy data
%A Guillaume Wang
%A Konstantin Donhauser
%A Fanny Yang
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-wang22k
%I PMLR
%P 10572--10602
%U https://proceedings.mlr.press/v151/wang22k.html
%V 151
%X We provide matching upper and lower bounds of order $\sigma^2/\log(d/n)$ for the prediction error of the minimum $\ell_1$-norm interpolator, a.k.a. basis pursuit. Our result is tight up to negligible terms when $d \gg n$, and is the first to imply asymptotic consistency of noisy minimum-norm interpolation for isotropic features and sparse ground truths. Our work complements the literature on "benign overfitting" for minimum $\ell_2$-norm interpolation, where asymptotic consistency can be achieved only when the features are effectively low-dimensional.
APA
Wang, G., Donhauser, K., & Yang, F. (2022). Tight bounds for minimum $\ell_1$-norm interpolation of noisy data. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:10572-10602. Available from https://proceedings.mlr.press/v151/wang22k.html.