Ridge Regression with Over-parametrized Two-Layer Networks Converge to Ridgelet Spectrum

Sho Sonoda, Isao Ishikawa, Masahiro Ikeda
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:2674-2682, 2021.

Abstract

Characterization of local minima has drawn much attention in theoretical studies of deep learning. In this study, we investigate the distribution of parameters in an over-parametrized finite neural network trained by ridge-regularized empirical square risk minimization (RERM). We develop a new theory of the ridgelet transform, a wavelet-like integral transform that provides a powerful and general framework for the theoretical study of neural networks with not only the ReLU but general activation functions. We show that the distribution of the parameters converges to a spectrum of the ridgelet transform. This result gives new insight into the characterization of the local minima of neural networks, and a theoretical background for inductive-bias theories based on lazy regimes. Through numerical experiments with finite models, we confirm the visual resemblance between the parameter distribution trained by SGD and the ridgelet spectrum computed by numerical integration.
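For reference, a common form of the ridgelet transform and its reconstruction formula, following the ridgelet-analysis literature (e.g., Sonoda & Murata, 2017). The pairing of the activation σ with an auxiliary function ρ via an admissibility condition is the standard setup in that literature; the exact normalization used in this paper may differ:

```latex
% Ridgelet transform of f : R^m -> R with respect to a window rho
% (a common definition; normalization conventions vary):
\[
  \mathcal{R}_\rho f(a,b)
  \;=\; \int_{\mathbb{R}^m} f(x)\,\overline{\rho(a\cdot x - b)}\,dx,
  \qquad (a,b)\in\mathbb{R}^m\times\mathbb{R}.
\]
% Reconstruction: for an admissible pair (sigma, rho),
\[
  f(x)
  \;=\; \int_{\mathbb{R}^m\times\mathbb{R}}
        \mathcal{R}_\rho f(a,b)\,\sigma(a\cdot x - b)\,da\,db.
\]
```

The abstract mentions computing the ridgelet spectrum by numerical integration and comparing it visually with the trained parameter distribution. The following is a minimal sketch of that procedure, not the authors' code; the target function, the window ρ, and the grids are all toy choices made here for illustration:

```python
import numpy as np

# Minimal sketch: approximate the ridgelet spectrum R_rho f(a, b) of a
# 1-D target f by numerical integration on a grid of hidden-unit
# parameters (a, b). The window rho below is an illustrative assumption;
# in the ridgelet framework it is paired with the network's activation
# via an admissibility condition.

def ridgelet_spectrum(f, rho, a_grid, b_grid, x, dx):
    """R_rho f(a, b) ~= sum_x f(x) * rho(a*x - b) * dx (Riemann sum)."""
    fx = f(x)                                    # samples of the target
    A, B = np.meshgrid(a_grid, b_grid, indexing="ij")
    # Broadcast rho(a*x - b) over the (a, b) grid, then integrate over x.
    spec = np.einsum("k,ijk->ij", fx, rho(A[..., None] * x - B[..., None]))
    return spec * dx

# Toy usage (all choices are assumptions for illustration):
x = np.linspace(-5.0, 5.0, 1001)
dx = x[1] - x[0]
f = lambda t: np.sin(t) * np.exp(-t**2 / 4)      # toy target function
rho = lambda z: (1 - z**2) * np.exp(-z**2 / 2)   # wavelet-like window
a_grid = np.linspace(-3.0, 3.0, 61)
b_grid = np.linspace(-6.0, 6.0, 121)
spectrum = ridgelet_spectrum(f, rho, a_grid, b_grid, x, dx)
# spectrum[i, j] can then be compared visually with a 2-D histogram of
# trained parameters (a_i, b_i) from an over-parametrized network.
```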

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-sonoda21a,
  title     = {Ridge Regression with Over-parametrized Two-Layer Networks Converge to Ridgelet Spectrum},
  author    = {Sonoda, Sho and Ishikawa, Isao and Ikeda, Masahiro},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {2674--2682},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/sonoda21a/sonoda21a.pdf},
  url       = {https://proceedings.mlr.press/v130/sonoda21a.html}
}
Endnote
%0 Conference Paper
%T Ridge Regression with Over-parametrized Two-Layer Networks Converge to Ridgelet Spectrum
%A Sho Sonoda
%A Isao Ishikawa
%A Masahiro Ikeda
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-sonoda21a
%I PMLR
%P 2674--2682
%U https://proceedings.mlr.press/v130/sonoda21a.html
%V 130
APA
Sonoda, S., Ishikawa, I. & Ikeda, M. (2021). Ridge Regression with Over-parametrized Two-Layer Networks Converge to Ridgelet Spectrum. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:2674-2682. Available from https://proceedings.mlr.press/v130/sonoda21a.html.