On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting

Shunta Akiyama, Taiji Suzuki
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:152-162, 2021.

Abstract

Deep learning empirically achieves high performance in many applications, but its training dynamics are not yet fully understood theoretically. In this paper, we give a theoretical analysis of training two-layer ReLU neural networks in a teacher-student regression model, in which a student network learns an unknown teacher network through its outputs. We show that, with a specific regularization and sufficient over-parameterization, the student network can identify the parameters of the teacher network with high probability via gradient descent with a norm-dependent step size, even though the objective function is highly non-convex. The key theoretical tools are the measure representation of neural networks and a novel application of a dual certificate argument for sparse estimation on a measure space. We analyze the global minima and the global convergence property in the measure space.
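For intuition, the following is a minimal Python sketch of the teacher-student setting described above: an over-parameterized two-layer ReLU student is trained by gradient descent on the squared loss against a fixed teacher's outputs. It uses a constant step size and a plain L2 penalty as stand-ins; the paper's norm-dependent step size and specific regularization are not reproduced here, and all widths and constants are illustrative assumptions.

# A minimal sketch (not the paper's algorithm) of the teacher-student setting:
# an over-parameterized two-layer ReLU student fits the outputs of a fixed
# teacher via gradient descent on the squared loss. Widths, step size, and
# the regularization constant are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
d, m_teacher, m_student, n = 10, 3, 30, 512  # dims/widths are arbitrary choices

def relu(z):
    return np.maximum(z, 0.0)

def two_layer(x, W, a):
    # f(x) = sum_j a_j * relu(w_j . x)
    return relu(x @ W.T) @ a

# Fixed teacher network with parameters unknown to the student.
W_star = rng.normal(size=(m_teacher, d))
a_star = rng.normal(size=m_teacher)

# Training data: random inputs and the teacher's outputs (noiseless regression).
X = rng.normal(size=(n, d))
y = two_layer(X, W_star, a_star)

# Over-parameterized student, trained by plain gradient descent.
W = rng.normal(size=(m_student, d)) * 0.1
a = rng.normal(size=m_student) * 0.1
lr, lam = 1e-2, 1e-4  # step size and L2 penalty: stand-ins, not the paper's choices

for step in range(2000):
    H = X @ W.T                      # pre-activations, shape (n, m_student)
    R = relu(H)
    err = R @ a - y                  # residuals, shape (n,)
    # Gradients of 0.5/n * ||err||^2 + 0.5 * lam * (||W||^2 + ||a||^2)
    grad_a = R.T @ err / n + lam * a
    grad_W = ((err[:, None] * (H > 0)) * a[None, :]).T @ X / n + lam * W
    a -= lr * grad_a
    W -= lr * grad_W

print("final training MSE:", float(np.mean((two_layer(X, W, a) - y) ** 2)))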

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-akiyama21a,
  title     = {On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting},
  author    = {Akiyama, Shunta and Suzuki, Taiji},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {152--162},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/akiyama21a/akiyama21a.pdf},
  url       = {https://proceedings.mlr.press/v139/akiyama21a.html}
}
Endnote
%0 Conference Paper
%T On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting
%A Shunta Akiyama
%A Taiji Suzuki
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-akiyama21a
%I PMLR
%P 152--162
%U https://proceedings.mlr.press/v139/akiyama21a.html
%V 139
APA
Akiyama, S. & Suzuki, T. (2021). On Learnability via Gradient Method for Two-Layer ReLU Neural Networks in Teacher-Student Setting. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:152-162. Available from https://proceedings.mlr.press/v139/akiyama21a.html.