An Initial Alignment between Neural Network and Target is Needed for Gradient Descent to Learn

Emmanuel Abbe, Elisabetta Cornacchia, Jan Hazla, Christopher Marquis
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:33-52, 2022.

Abstract

This paper introduces the notion of “Initial Alignment” (INAL) between a neural network at initialization and a target function. It is proved that if a network and a Boolean target function do not have a noticeable INAL, then noisy gradient descent with normalized i.i.d. initialization will not learn in polynomial time. Thus a certain amount of knowledge about the target (measured by the INAL) is needed in the architecture design. This also provides an answer to an open problem posed in (AS-NeurIPS’20). The results are based on deriving lower-bounds for descent algorithms on symmetric neural networks without explicit knowledge of the target function beyond its INAL.

Cite this Paper
BibTeX
@InProceedings{pmlr-v162-abbe22a,
  title     = {An Initial Alignment between Neural Network and Target is Needed for Gradient Descent to Learn},
  author    = {Abbe, Emmanuel and Cornacchia, Elisabetta and Hazla, Jan and Marquis, Christopher},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {33--52},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/abbe22a/abbe22a.pdf},
  url       = {https://proceedings.mlr.press/v162/abbe22a.html},
  abstract  = {This paper introduces the notion of “Initial Alignment” (INAL) between a neural network at initialization and a target function. It is proved that if a network and a Boolean target function do not have a noticeable INAL, then noisy gradient descent with normalized i.i.d. initialization will not learn in polynomial time. Thus a certain amount of knowledge about the target (measured by the INAL) is needed in the architecture design. This also provides an answer to an open problem posed in (AS-NeurIPS’20). The results are based on deriving lower-bounds for descent algorithms on symmetric neural networks without explicit knowledge of the target function beyond its INAL.}
}
Endnote
%0 Conference Paper
%T An Initial Alignment between Neural Network and Target is Needed for Gradient Descent to Learn
%A Emmanuel Abbe
%A Elisabetta Cornacchia
%A Jan Hazla
%A Christopher Marquis
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-abbe22a
%I PMLR
%P 33--52
%U https://proceedings.mlr.press/v162/abbe22a.html
%V 162
%X This paper introduces the notion of “Initial Alignment” (INAL) between a neural network at initialization and a target function. It is proved that if a network and a Boolean target function do not have a noticeable INAL, then noisy gradient descent with normalized i.i.d. initialization will not learn in polynomial time. Thus a certain amount of knowledge about the target (measured by the INAL) is needed in the architecture design. This also provides an answer to an open problem posed in (AS-NeurIPS’20). The results are based on deriving lower-bounds for descent algorithms on symmetric neural networks without explicit knowledge of the target function beyond its INAL.
APA
Abbe, E., Cornacchia, E., Hazla, J. & Marquis, C. (2022). An Initial Alignment between Neural Network and Target is Needed for Gradient Descent to Learn. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:33-52. Available from https://proceedings.mlr.press/v162/abbe22a.html.