Mean-field Analysis on Two-layer Neural Networks from a Kernel Perspective

Shokichi Takakura, Taiji Suzuki
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:47475-47509, 2024.

Abstract

In this paper, we study the feature learning ability of two-layer neural networks in the mean-field regime through the lens of kernel methods. To focus on the dynamics of the kernel induced by the first layer, we utilize a two-timescale limit, where the second layer moves much faster than the first layer. In this limit, the learning problem is reduced to the minimization problem over the intrinsic kernel. Then, we show the global convergence of the mean-field Langevin dynamics and derive time and particle discretization errors. We also demonstrate that two-layer neural networks can learn a union of multiple reproducing kernel Hilbert spaces more efficiently than any kernel method, and that neural networks acquire a data-dependent kernel which aligns with the target function. In addition, we develop a label noise procedure that converges to the global optimum, and show that the degrees of freedom appear as an implicit regularization.
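To make the setting in the abstract concrete, the following minimal NumPy sketch (not from the paper; the hyperparameters eta, lam, ridge, and M and the toy data are illustrative assumptions) shows the kind of two-timescale, finite-particle dynamics referred to: before each noisy update of the first-layer particles, the second layer is re-fit by ridge regression on the hidden features, so the first layer effectively descends a loss defined through the induced kernel.

```python
# Illustrative sketch (not the paper's code): a finite-particle discretization
# of mean-field Langevin dynamics on the first layer of a two-layer network,
# with the second layer re-fit in closed form at every step (two-timescale limit).
import numpy as np

rng = np.random.default_rng(0)
n, d, M = 200, 5, 100            # samples, input dimension, number of neurons (particles)
X = rng.normal(size=(n, d))
y = np.tanh(X[:, 0] * X[:, 1])   # toy target, purely for illustration

W = rng.normal(size=(M, d))      # first-layer particles
eta, lam, ridge = 0.05, 1e-2, 1e-3
sigma = np.tanh

for step in range(500):
    # Fast (second) layer: solve the ridge problem on the hidden features.
    Phi = sigma(X @ W.T) / M                     # (n, M) features; f(x) = Phi @ a
    a = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(M), Phi.T @ y)
    resid = Phi @ a - y                          # (n,) residuals of the kernel fit

    # Slow (first) layer: one Langevin step on the particles W.
    dPhi = (1 - sigma(X @ W.T) ** 2) / M         # tanh'(z) = 1 - tanh(z)^2
    grad = ((resid[:, None] * dPhi) * a[None, :]).T @ X / n   # (M, d) loss gradient
    noise = np.sqrt(2 * eta * lam) * rng.normal(size=W.shape)
    W = W - eta * (grad + lam * W) + noise       # gradient + weight decay + Gaussian noise

print("training MSE after the last inner fit:", float(np.mean(resid ** 2)))
```

The Gaussian noise scaled by sqrt(2 * eta * lam) is what makes this a Langevin (entropy-regularized) update on the particle distribution rather than plain gradient descent on the first-layer weights.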

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-takakura24a,
  title     = {Mean-field Analysis on Two-layer Neural Networks from a Kernel Perspective},
  author    = {Takakura, Shokichi and Suzuki, Taiji},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {47475--47509},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/takakura24a/takakura24a.pdf},
  url       = {https://proceedings.mlr.press/v235/takakura24a.html},
  abstract  = {In this paper, we study the feature learning ability of two-layer neural networks in the mean-field regime through the lens of kernel methods. To focus on the dynamics of the kernel induced by the first layer, we utilize a two-timescale limit, where the second layer moves much faster than the first layer. In this limit, the learning problem is reduced to the minimization problem over the intrinsic kernel. Then, we show the global convergence of the mean-field Langevin dynamics and derive time and particle discretization errors. We also demonstrate that two-layer neural networks can learn a union of multiple reproducing kernel Hilbert spaces more efficiently than any kernel method, and that neural networks acquire a data-dependent kernel which aligns with the target function. In addition, we develop a label noise procedure that converges to the global optimum, and show that the degrees of freedom appear as an implicit regularization.}
}
Endnote
%0 Conference Paper
%T Mean-field Analysis on Two-layer Neural Networks from a Kernel Perspective
%A Shokichi Takakura
%A Taiji Suzuki
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-takakura24a
%I PMLR
%P 47475--47509
%U https://proceedings.mlr.press/v235/takakura24a.html
%V 235
%X In this paper, we study the feature learning ability of two-layer neural networks in the mean-field regime through the lens of kernel methods. To focus on the dynamics of the kernel induced by the first layer, we utilize a two-timescale limit, where the second layer moves much faster than the first layer. In this limit, the learning problem is reduced to the minimization problem over the intrinsic kernel. Then, we show the global convergence of the mean-field Langevin dynamics and derive time and particle discretization errors. We also demonstrate that two-layer neural networks can learn a union of multiple reproducing kernel Hilbert spaces more efficiently than any kernel method, and that neural networks acquire a data-dependent kernel which aligns with the target function. In addition, we develop a label noise procedure that converges to the global optimum, and show that the degrees of freedom appear as an implicit regularization.
APA
Takakura, S. & Suzuki, T. (2024). Mean-field Analysis on Two-layer Neural Networks from a Kernel Perspective. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:47475-47509. Available from https://proceedings.mlr.press/v235/takakura24a.html.
