The Dynamics of Learning: A Random Matrix Approach

Zhenyu Liao, Romain Couillet;
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3072-3081, 2018.

Abstract

Understanding the learning dynamics of neural networks is key both to improving optimization algorithms and to explaining theoretically why deep neural networks work so well today. In this paper, we introduce a random matrix-based framework to analyze the learning dynamics of a single-layer linear network trained by gradient descent on a binary classification problem, for data of simultaneously large dimension and size. Our results provide rich insights into common questions about neural networks, such as overfitting, early stopping, and the initialization of training, thereby opening the door to future studies of more elaborate structures and models appearing in today's neural networks.
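The setting the abstract describes can be sketched numerically: a single-layer linear network trained by gradient descent on a binary Gaussian-mixture classification task, with the data dimension and sample size both large and of comparable order, tracking training and test error over the course of training. This is a minimal illustrative sketch, not the authors' exact model or parameters; all choices below (dimension, sample size, class means, learning rate, step count) are assumptions made for illustration.

```python
import numpy as np

# Hypothetical sketch of the setting in the abstract: a single-layer
# linear network w, trained by gradient descent on a binary
# classification problem, with dimension p and sample size n both large.
# All parameter values here are illustrative assumptions.
rng = np.random.default_rng(0)

p, n, n_test = 256, 512, 512        # comparably large dimension and size
mu = np.ones(p) / np.sqrt(p)        # class means at +mu and -mu

def sample(m):
    """Draw m points from a two-class symmetric Gaussian mixture."""
    y = rng.choice([-1.0, 1.0], size=m)
    X = y[:, None] * mu[None, :] + rng.standard_normal((m, p))
    return X, y

X, y = sample(n)
Xt, yt = sample(n_test)

w = rng.standard_normal(p) / np.sqrt(p)   # random initialization
lr, steps = 0.05, 200
train_err, test_err = [], []

for t in range(steps):
    # gradient of the mean squared loss (1/2n) * ||X w - y||^2
    grad = X.T @ (X @ w - y) / n
    w -= lr * grad
    train_err.append(np.mean(np.sign(X @ w) != y))
    test_err.append(np.mean(np.sign(Xt @ w) != yt))

# As training proceeds, the gap between training and test error is the
# kind of overfitting / early-stopping behavior the paper analyzes with
# random matrix tools.
print(f"final train error: {train_err[-1]:.3f}, test error: {test_err[-1]:.3f}")
```

Plotting `train_err` and `test_err` against the step index `t` shows the learning trajectories whose large-dimensional limits the paper characterizes analytically.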
