Fast rates for noisy interpolation require rethinking the effect of inductive bias

Konstantin Donhauser, Nicolò Ruggeri, Stefan Stojanovic, Fanny Yang
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:5397-5428, 2022.

Abstract

Good generalization performance on high-dimensional data crucially hinges on a simple structure of the ground truth and a corresponding strong inductive bias of the estimator. Even though this intuition is valid for regularized models, in this paper we caution against a strong inductive bias for interpolation in the presence of noise: While a stronger inductive bias encourages a simpler structure that is more aligned with the ground truth, it also increases the detrimental effect of noise. Specifically, for both linear regression and classification with a sparse ground truth, we prove that minimum $\ell_p$-norm and maximum $\ell_p$-margin interpolators achieve fast polynomial rates close to order $1/n$ for $p > 1$ compared to a logarithmic rate for $p = 1$. Finally, we provide preliminary experimental evidence that this trade-off may also play a crucial role in understanding non-linear interpolating models used in practice.
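
For concreteness, the two estimators referenced in the abstract can be written in their standard form. The following is a sketch in common notation (not quoted from the paper): $X \in \mathbb{R}^{n \times d}$ stacks the inputs, $y$ collects the labels, and $[n] = \{1, \dots, n\}$.

$$\hat{\theta}_p \;=\; \arg\min_{\theta \in \mathbb{R}^d} \|\theta\|_p \quad \text{subject to} \quad X\theta = y \qquad \text{(minimum-}\ell_p\text{-norm interpolation, regression)}$$

$$\hat{\theta}_p \;=\; \arg\max_{\theta :\, \|\theta\|_p \leq 1} \; \min_{i \in [n]} \; y_i \langle x_i, \theta \rangle \qquad \text{(maximum-}\ell_p\text{-margin interpolation, classification)}$$

For $p = 2$ the regression program has the closed form $\hat{\theta}_2 = X^\top (X X^\top)^{-1} y$ whenever $X X^\top$ is invertible; for $p = 1$ it is the basis-pursuit solution, whose stronger sparsity-inducing bias is precisely the inductive bias the abstract cautions against when interpolating noisy data.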

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-donhauser22a,
  title     = {Fast rates for noisy interpolation require rethinking the effect of inductive bias},
  author    = {Donhauser, Konstantin and Ruggeri, Nicol{\`o} and Stojanovic, Stefan and Yang, Fanny},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {5397--5428},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/donhauser22a/donhauser22a.pdf},
  url       = {https://proceedings.mlr.press/v162/donhauser22a.html}
}
Endnote
%0 Conference Paper
%T Fast rates for noisy interpolation require rethinking the effect of inductive bias
%A Konstantin Donhauser
%A Nicolò Ruggeri
%A Stefan Stojanovic
%A Fanny Yang
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-donhauser22a
%I PMLR
%P 5397--5428
%U https://proceedings.mlr.press/v162/donhauser22a.html
%V 162
APA
Donhauser, K., Ruggeri, N., Stojanovic, S. & Yang, F. (2022). Fast rates for noisy interpolation require rethinking the effect of inductive bias. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:5397-5428. Available from https://proceedings.mlr.press/v162/donhauser22a.html.
