Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?

Yaniv Blumenfeld, Dar Gilboa, Daniel Soudry
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:960-969, 2020.

Abstract

Deep neural networks are typically initialized with random weights, with variances chosen to facilitate signal propagation and stable gradients. It is also believed that diversity of features is an important property of these initializations. We construct a deep convolutional network with identical features by initializing almost all the weights to $0$. The architecture also enables perfect signal propagation and stable gradients, and achieves high accuracy on standard benchmarks. This indicates that random, diverse initializations are \emph{not} necessary for training neural networks. An essential element in training this network is a mechanism of symmetry breaking; we study this phenomenon and find that standard GPU operations, which are non-deterministic, can serve as a sufficient source of symmetry breaking to enable training.
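The symmetry-breaking issue the abstract raises can be illustrated with a toy example (a NumPy sketch, not the paper's actual construction): when two hidden units start with identical incoming and outgoing weights, they receive bitwise-identical gradients under exact deterministic arithmetic, so gradient descent alone never differentiates them.

```python
import numpy as np

# Sketch (not the authors' architecture): two hidden units initialized
# identically stay identical under deterministic training, because every
# quantity computed for unit 0 is computed from the same values, in the
# same order, as for unit 1. Some external source of symmetry breaking
# (e.g. non-deterministic GPU reductions) is needed to separate them.
rng = np.random.default_rng(0)
x = rng.normal(size=(8, 4))                     # batch of inputs
y = rng.normal(size=(8, 1))                     # regression targets

W1 = np.tile(rng.normal(size=(1, 4)), (2, 1))   # two identical hidden units
W2 = np.ones((1, 2))                            # identical outgoing weights

for _ in range(100):
    h = np.maximum(0.0, x @ W1.T)               # ReLU hidden layer, shape (8, 2)
    pred = h @ W2.T                             # network output, shape (8, 1)
    err = pred - y                              # dL/dpred for 0.5 * MSE loss
    gW2 = err.T @ h                             # gradient w.r.t. W2, shape (1, 2)
    gW1 = ((err @ W2) * (h > 0)).T @ x          # gradient w.r.t. W1, shape (2, 4)
    W1 -= 0.01 * gW1
    W2 -= 0.01 * gW2

# After training, the two hidden units are still exactly equal:
assert np.array_equal(W1[0], W1[1])
```

The units remain degenerate because the symmetry is preserved exactly at every step; the paper's observation is that in practice, non-determinism in standard GPU operations perturbs this symmetry enough for training to succeed.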

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-blumenfeld20a,
  title     = {Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?},
  author    = {Blumenfeld, Yaniv and Gilboa, Dar and Soudry, Daniel},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {960--969},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/blumenfeld20a/blumenfeld20a.pdf},
  url       = {http://proceedings.mlr.press/v119/blumenfeld20a.html},
  abstract  = {Deep neural networks are typically initialized with random weights, with variances chosen to facilitate signal propagation and stable gradients. It is also believed that diversity of features is an important property of these initializations. We construct a deep convolutional network with identical features by initializing almost all the weights to $0$. The architecture also enables perfect signal propagation and stable gradients, and achieves high accuracy on standard benchmarks. This indicates that random, diverse initializations are \emph{not} necessary for training neural networks. An essential element in training this network is a mechanism of symmetry breaking; we study this phenomenon and find that standard GPU operations, which are non-deterministic, can serve as a sufficient source of symmetry breaking to enable training.}
}
Endnote
%0 Conference Paper
%T Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization?
%A Yaniv Blumenfeld
%A Dar Gilboa
%A Daniel Soudry
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-blumenfeld20a
%I PMLR
%P 960--969
%U http://proceedings.mlr.press/v119/blumenfeld20a.html
%V 119
%X Deep neural networks are typically initialized with random weights, with variances chosen to facilitate signal propagation and stable gradients. It is also believed that diversity of features is an important property of these initializations. We construct a deep convolutional network with identical features by initializing almost all the weights to $0$. The architecture also enables perfect signal propagation and stable gradients, and achieves high accuracy on standard benchmarks. This indicates that random, diverse initializations are \emph{not} necessary for training neural networks. An essential element in training this network is a mechanism of symmetry breaking; we study this phenomenon and find that standard GPU operations, which are non-deterministic, can serve as a sufficient source of symmetry breaking to enable training.
APA
Blumenfeld, Y., Gilboa, D., & Soudry, D. (2020). Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization? Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:960-969. Available from http://proceedings.mlr.press/v119/blumenfeld20a.html.