The Spectrum of Fisher Information of Deep Networks Achieving Dynamical Isometry

Tomohiro Hayase, Ryo Karakida
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:334-342, 2021.

Abstract

The Fisher information matrix (FIM) is fundamental to understanding the trainability of deep neural networks (DNNs), since it describes the local metric of the parameter space. We investigate the spectral distribution of the conditional FIM, i.e., the FIM given a single sample, focusing on fully-connected networks achieving dynamical isometry. While dynamical isometry is known to keep specific backpropagated signals independent of the depth, we find that the local metric of the parameter space depends linearly on the depth even under dynamical isometry. More precisely, we reveal that the conditional FIM's spectrum concentrates around its maximum, and that this maximum grows linearly as the depth increases. To examine the spectrum under random initialization and in the infinite-width limit, we construct an algebraic methodology based on free probability theory. As a byproduct, we provide an analysis of the exactly solvable spectral distribution in the two-hidden-layer case. Lastly, experimental results verify that the appropriate learning rate for online training of DNNs is inversely proportional to the depth, as determined by the conditional FIM's spectrum.
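
The linear-in-depth growth of the conditional FIM's spectral maximum is easy to probe numerically. The following is a minimal sketch of such a check (our illustration, not the authors' code), assuming a deep linear fully-connected network with orthogonal weights, the simplest model achieving exact dynamical isometry, and the squared-error FIM convention F(x) = J(x)^T J(x), where J(x) is the Jacobian of the network output with respect to all parameters; all names below are illustrative.

import jax
import jax.numpy as jnp
from jax.flatten_util import ravel_pytree

WIDTH = 64

def init_params(key, depth):
    # Orthogonal weights with gain 1: the input-output Jacobian has all
    # singular values equal to 1, i.e., exact dynamical isometry.
    return [jax.random.orthogonal(k, WIDTH) for k in jax.random.split(key, depth)]

def forward(params, x):
    h = x
    for W in params:
        h = W @ h  # linear layers keep the example exactly solvable
    return h

def conditional_fim_spectrum(params, x):
    # The nonzero eigenvalues of F(x) = J^T J (P x P, P = #parameters)
    # coincide with those of the small dual matrix J J^T (out_dim x out_dim).
    flat, unravel = ravel_pytree(params)
    J = jax.jacobian(lambda p: forward(unravel(p), x))(flat)
    return jnp.linalg.eigvalsh(J @ J.T)

x = jax.random.normal(jax.random.PRNGKey(0), (WIDTH,))
x = x / jnp.linalg.norm(x)  # unit-norm input sample
for depth in (2, 4, 8, 16):
    lam = conditional_fim_spectrum(init_params(jax.random.PRNGKey(depth), depth), x)
    # Here every eigenvalue equals depth * ||x||^2, so the spectral
    # maximum grows linearly in depth despite dynamical isometry.
    print(depth, float(lam.max()))

Since the largest stable gradient-descent learning rate under a quadratic approximation of the loss scales like 2/lambda_max, this depth-linear spectral maximum is what makes the appropriate online learning rate shrink in inverse proportion to the depth, as the paper's experiments verify.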

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-hayase21a,
  title     = {The Spectrum of Fisher Information of Deep Networks Achieving Dynamical Isometry},
  author    = {Hayase, Tomohiro and Karakida, Ryo},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {334--342},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/hayase21a/hayase21a.pdf},
  url       = {https://proceedings.mlr.press/v130/hayase21a.html}
}
Endnote
%0 Conference Paper
%T The Spectrum of Fisher Information of Deep Networks Achieving Dynamical Isometry
%A Tomohiro Hayase
%A Ryo Karakida
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-hayase21a
%I PMLR
%P 334--342
%U https://proceedings.mlr.press/v130/hayase21a.html
%V 130
APA
Hayase, T. & Karakida, R. (2021). The Spectrum of Fisher Information of Deep Networks Achieving Dynamical Isometry. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:334-342. Available from https://proceedings.mlr.press/v130/hayase21a.html.
