Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks

Cong Fang, Jason Lee, Pengkun Yang, Tong Zhang
Proceedings of Thirty Fourth Conference on Learning Theory, PMLR 134:1887-1936, 2021.

Abstract

This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs), which can be used to analyze neural network training. In this framework, a DNN is represented in the continuous limit by probability measures and functions over its features (that is, the function values of the hidden units on the training data), rather than by the network parameters, as most existing studies have done. This new representation overcomes the degenerate situation in which each middle layer essentially contains only one meaningful hidden unit, and it leads to a simpler representation of DNNs. Moreover, we construct a non-linear dynamics called neural feature flow, which captures the evolution of an over-parameterized DNN trained by gradient descent. We illustrate the framework with the Residual Network (Res-Net) architecture and show that, when the neural feature flow process converges, it reaches a global minimizer under suitable conditions.
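
For intuition, the following is a minimal LaTeX sketch contrasting the usual weight-space mean-field limit with the feature-space view the abstract describes. It is our own illustration under simplifying assumptions (a two-layer network for the weight-space limit; the symbols z_h and p_t are ours and do not come from the paper), not the paper's formulation:

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Weight-space mean-field limit of a width-$m$ two-layer network:
\[
  f_m(x) = \frac{1}{m}\sum_{i=1}^{m} a_i\,\sigma(w_i^\top x)
  \;\longrightarrow\;
  f(x) = \int a\,\sigma(w^\top x)\,\mathrm{d}\rho(a,w)
  \qquad (m \to \infty),
\]
% where $\rho$ is a probability measure over the parameters $(a, w)$.
% Feature-space view (notation ours, not the paper's): given training
% inputs $x_1, \dots, x_n$, identify each hidden unit $h$ with its
% feature vector of outputs on the data,
\[
  z_h = \bigl(h(x_1), \dots, h(x_n)\bigr) \in \mathbb{R}^n,
\]
% and represent a layer by a probability measure $p$ over such vectors;
% neural feature flow is then the dynamics $t \mapsto p_t$ that these
% measures follow under gradient-descent training.
\end{document}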

Cite this Paper


BibTeX
@InProceedings{pmlr-v134-fang21a,
  title     = {Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks},
  author    = {Fang, Cong and Lee, Jason and Yang, Pengkun and Zhang, Tong},
  booktitle = {Proceedings of Thirty Fourth Conference on Learning Theory},
  pages     = {1887--1936},
  year      = {2021},
  editor    = {Belkin, Mikhail and Kpotufe, Samory},
  volume    = {134},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--19 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v134/fang21a/fang21a.pdf},
  url       = {https://proceedings.mlr.press/v134/fang21a.html}
}
Endnote
%0 Conference Paper
%T Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks
%A Cong Fang
%A Jason Lee
%A Pengkun Yang
%A Tong Zhang
%B Proceedings of Thirty Fourth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2021
%E Mikhail Belkin
%E Samory Kpotufe
%F pmlr-v134-fang21a
%I PMLR
%P 1887--1936
%U https://proceedings.mlr.press/v134/fang21a.html
%V 134
APA
Fang, C., Lee, J., Yang, P. & Zhang, T. (2021). Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks. Proceedings of Thirty Fourth Conference on Learning Theory, in Proceedings of Machine Learning Research 134:1887-1936. Available from https://proceedings.mlr.press/v134/fang21a.html.