The Dynamics of Learning: A Random Matrix Approach

Zhenyu Liao, Romain Couillet
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3072-3081, 2018.

Abstract

Understanding the learning dynamics of neural networks is one of the key issues for the improvement of optimization algorithms as well as for the theoretical comprehension of why deep neural nets work so well today. In this paper, we introduce a random matrix-based framework to analyze the learning dynamics of a single-layer linear network on a binary classification problem, for data of simultaneously large dimension and size, trained by gradient descent. Our results provide rich insights into common questions in neural nets, such as overfitting, early stopping and the initialization of training, thereby opening the door for future studies of more elaborate structures and models appearing in today’s neural networks.
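The setting described in the abstract, a single-layer linear network trained by gradient descent on a binary classification task where dimension and sample size are of the same order, can be sketched in a few lines. The data model (a symmetric two-class Gaussian mixture), loss, step size, and all parameter values below are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

# Illustrative sketch: single-layer linear network w^T x trained by
# gradient descent on squared loss, with dimension p and sample size n
# both large and comparable (here p / n = 0.5).
rng = np.random.default_rng(0)
p, n = 256, 512
mu = np.zeros(p)
mu[0] = 2.0  # class means are +mu and -mu (hypothetical data model)

def sample(n_samples):
    """Draw labels y in {-1, +1} and Gaussian features centered at y * mu."""
    y = rng.choice([-1.0, 1.0], size=n_samples)
    X = rng.standard_normal((n_samples, p)) + np.outer(y, mu)
    return X, y

X_train, y_train = sample(n)
X_test, y_test = sample(n)

w = rng.standard_normal(p) / np.sqrt(p)  # random initialization
lr = 0.1 / n                             # small step for stability
for t in range(2000):
    # Gradient of the squared loss 0.5 * ||X w - y||^2
    grad = X_train.T @ (X_train @ w - y_train)
    w -= lr * grad

# Classification error of the learned linear classifier on fresh data;
# tracking this during training is what reveals overfitting and the
# benefit of early stopping.
test_error = np.mean(np.sign(X_test @ w) != y_test)
```

In this sketch the test error typically decreases early in training and can creep back up as `w` fits the noise in the training sample, which is the overfitting / early-stopping phenomenon the paper analyzes with random matrix tools.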

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-liao18b,
  title     = {The Dynamics of Learning: A Random Matrix Approach},
  author    = {Liao, Zhenyu and Couillet, Romain},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3072--3081},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/liao18b/liao18b.pdf},
  url       = {https://proceedings.mlr.press/v80/liao18b.html},
  abstract  = {Understanding the learning dynamics of neural networks is one of the key issues for the improvement of optimization algorithms as well as for the theoretical comprehension of why deep neural nets work so well today. In this paper, we introduce a random matrix-based framework to analyze the learning dynamics of a single-layer linear network on a binary classification problem, for data of simultaneously large dimension and size, trained by gradient descent. Our results provide rich insights into common questions in neural nets, such as overfitting, early stopping and the initialization of training, thereby opening the door for future studies of more elaborate structures and models appearing in today’s neural networks.}
}
Endnote
%0 Conference Paper
%T The Dynamics of Learning: A Random Matrix Approach
%A Zhenyu Liao
%A Romain Couillet
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-liao18b
%I PMLR
%P 3072--3081
%U https://proceedings.mlr.press/v80/liao18b.html
%V 80
%X Understanding the learning dynamics of neural networks is one of the key issues for the improvement of optimization algorithms as well as for the theoretical comprehension of why deep neural nets work so well today. In this paper, we introduce a random matrix-based framework to analyze the learning dynamics of a single-layer linear network on a binary classification problem, for data of simultaneously large dimension and size, trained by gradient descent. Our results provide rich insights into common questions in neural nets, such as overfitting, early stopping and the initialization of training, thereby opening the door for future studies of more elaborate structures and models appearing in today’s neural networks.
APA
Liao, Z. & Couillet, R. (2018). The Dynamics of Learning: A Random Matrix Approach. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3072-3081. Available from https://proceedings.mlr.press/v80/liao18b.html.