The Slow Deterioration of the Generalization Error of the Random Feature Model

Chao Ma, Lei Wu, Weinan E
Proceedings of The First Mathematical and Scientific Machine Learning Conference, PMLR 107:373-389, 2020.

Abstract

The random feature model exhibits a kind of resonance behavior when the number of parameters is close to the training sample size. This behavior is characterized by the appearance of a large generalization gap, and is caused by the occurrence of very small eigenvalues of the associated Gram matrix. In this paper, we examine the dynamic behavior of the gradient descent algorithm in this regime. We show, both theoretically and experimentally, that there is a dynamic self-correction mechanism at work: the larger the eventual generalization gap, the slower it develops, with both effects driven by the small eigenvalues. This leaves ample time to stop the training process and obtain solutions with good generalization properties.
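The setup described in the abstract can be reproduced in a few lines of NumPy. The following is a minimal sketch, not the authors' code: the target function, the ReLU feature map, the noise level, and all hyperparameters are illustrative choices. It trains the outer coefficients of a random feature model by gradient descent at the "resonance" point m = n and prints the smallest Gram eigenvalue together with the train/test errors over time.

import numpy as np

rng = np.random.default_rng(0)

d, n = 20, 200            # input dimension, training sample size
m = n                     # number of random features ~ sample size (resonance regime)

# Illustrative target with label noise on the training set
beta = rng.normal(size=d) / np.sqrt(d)
X_train = rng.normal(size=(n, d))
y_train = np.tanh(X_train @ beta) + 0.1 * rng.normal(size=n)
X_test = rng.normal(size=(1000, d))
y_test = np.tanh(X_test @ beta)

# Random feature model: fixed inner weights W, trainable outer coefficients a
W = rng.normal(size=(d, m)) / np.sqrt(d)
Phi_train = np.maximum(X_train @ W, 0.0)    # ReLU features, shape (n, m)
Phi_test = np.maximum(X_test @ W, 0.0)

# Gram matrix of the features; very small eigenvalues appear when m ~ n
G = Phi_train @ Phi_train.T / m
eigvals = np.linalg.eigvalsh(G)             # ascending order
print(f"Gram eigenvalues: min {eigvals[0]:.2e}, max {eigvals[-1]:.2e}")

# Gradient descent on the least-squares loss over a
L = np.linalg.norm(Phi_train, 2) ** 2 / n   # Lipschitz constant of the gradient
lr = 1.0 / L
a = np.zeros(m)
for step in range(1, 100001):
    resid = Phi_train @ a - y_train
    a -= lr * Phi_train.T @ resid / n
    if step % 10000 == 0:
        print(f"step {step:6d}  train {np.mean(resid**2):.3e}  "
              f"test {np.mean((Phi_test @ a - y_test) ** 2):.3e}")

Along the eigendirection with eigenvalue lambda, the residual decays like (1 - lr * lambda)^t, so the noise components living in the small-eigenvalue directions are fit very slowly. Under these assumptions one typically sees the test error drop quickly and then deteriorate only gradually, which is the long early-stopping window the paper describes.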

Cite this Paper


BibTeX
@InProceedings{pmlr-v107-ma20a,
  title     = {The Slow Deterioration of the Generalization Error of the Random Feature Model},
  author    = {Ma, Chao and Wu, Lei and E, Weinan},
  booktitle = {Proceedings of The First Mathematical and Scientific Machine Learning Conference},
  pages     = {373--389},
  year      = {2020},
  editor    = {Lu, Jianfeng and Ward, Rachel},
  volume    = {107},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v107/ma20a/ma20a.pdf},
  url       = {https://proceedings.mlr.press/v107/ma20a.html}
}
APA
Ma, C., Wu, L. & E, W. (2020). The Slow Deterioration of the Generalization Error of the Random Feature Model. Proceedings of The First Mathematical and Scientific Machine Learning Conference, in Proceedings of Machine Learning Research 107:373-389. Available from https://proceedings.mlr.press/v107/ma20a.html.