Kernel regression in high dimensions: Refined analysis beyond double descent

Fanghui Liu, Zhenyu Liao, Johan Suykens
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:649-657, 2021.

Abstract

In this paper, we provide a precise characterization of the generalization properties of high-dimensional kernel ridge regression across the under- and over-parameterized regimes, depending on whether the number of training samples n exceeds the feature dimension d. By establishing a bias-variance decomposition of the expected excess risk, we show that, while the bias is (almost) independent of d and monotonically decreases with n, the variance depends on both n and d and can be unimodal or monotonically decreasing under different regularization schemes. Our refined analysis goes beyond double descent theory by showing that, depending on the data eigen-profile and the level of regularization, the kernel regression risk curve can be a double-descent-like, bell-shaped, or monotonic function of n. Experiments on synthetic and real data support our theoretical findings.
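For reference, the kernel ridge regression estimator and the bias-variance decomposition that the abstract refers to take the following standard form (the notation here is generic and illustrative; the paper's exact definitions and scaling may differ):

\[ \hat f(x) = k(x, X)^\top (K + n\lambda I_n)^{-1} y, \qquad K_{ij} = k(x_i, x_j), \ K \in \mathbb{R}^{n \times n}, \]

where \(\lambda > 0\) is the regularization parameter. The expected excess risk then splits as

\[ \mathbb{E}\big[\mathcal{E}(\hat f)\big] = \underbrace{\big\| \mathbb{E}[\hat f] - f^* \big\|_{L^2}^2}_{\text{bias}^2} + \underbrace{\mathbb{E}\,\big\| \hat f - \mathbb{E}[\hat f] \big\|_{L^2}^2}_{\text{variance}}, \]

with \(f^*\) the target function. The paper characterizes how these two terms behave as both n and d grow large.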

Cite this Paper

BibTeX
@InProceedings{pmlr-v130-liu21b,
  title     = {Kernel regression in high dimensions: Refined analysis beyond double descent},
  author    = {Liu, Fanghui and Liao, Zhenyu and Suykens, Johan},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {649--657},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/liu21b/liu21b.pdf},
  url       = {https://proceedings.mlr.press/v130/liu21b.html}
}
Endnote
%0 Conference Paper
%T Kernel regression in high dimensions: Refined analysis beyond double descent
%A Fanghui Liu
%A Zhenyu Liao
%A Johan Suykens
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-liu21b
%I PMLR
%P 649--657
%U https://proceedings.mlr.press/v130/liu21b.html
%V 130
APA
Liu, F., Liao, Z. & Suykens, J. (2021). Kernel regression in high dimensions: Refined analysis beyond double descent. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:649-657. Available from https://proceedings.mlr.press/v130/liu21b.html.