Information-Theoretic Local Minima Characterization and Regularization

Zhiwei Jia, Hao Su
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4773-4783, 2020.

Abstract

Recent advances in deep learning theory have motivated the study of generalizability across different local minima of deep neural networks (DNNs). While existing work has focused on either discovering properties of good local minima or developing regularization techniques to induce them, no single approach tackles both problems. We achieve these two goals in a unified manner. Specifically, based on the observed Fisher information, we propose a metric that is both strongly indicative of the generalizability of local minima and effective as a practical regularizer. We provide theoretical analysis, including a generalization bound, and empirically demonstrate the success of our approach in both capturing and improving the generalizability of DNNs. Experiments are performed on CIFAR-10, CIFAR-100, and ImageNet with various network architectures.
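The abstract names the ingredients but not the mechanics. As a hedged illustration only, the sketch below (PyTorch) trains against a Monte-Carlo estimate of the trace of the observed Fisher information, tr F(theta) = (1/n) sum_i ||grad_theta log p(y_i | x_i; theta)||^2; the toy model, the trace-based penalty, and the coefficient lam are assumptions chosen for illustration, not the paper's actual metric or regularizer, which are defined in the full text.

    # Hedged sketch (not the paper's exact method): regularize training with an
    # estimate of the trace of the observed Fisher information at theta.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Toy CIFAR-10-sized classifier (illustrative only).
    model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))

    def observed_fisher_trace(model, x, y):
        # (1/n) sum_i ||grad_theta log p(y_i | x_i; theta)||^2, using the
        # observed labels; create_graph=True allows backprop through the penalty.
        total = 0.0
        for xi, yi in zip(x, y):
            log_p = F.log_softmax(model(xi.unsqueeze(0)), dim=1)[0, yi]
            grads = torch.autograd.grad(log_p, model.parameters(), create_graph=True)
            total = total + sum(g.pow(2).sum() for g in grads)
        return total / x.shape[0]

    x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
    lam = 1e-3  # hypothetical regularization strength
    loss = F.cross_entropy(model(x), y) + lam * observed_fisher_trace(model, x, y)
    loss.backward()  # second-order terms flow through the penalty

Note that the per-example gradient loop makes this naive estimator expensive at scale; the abstract's claim of a practical regularizer suggests the paper uses a more efficient formulation, which this sketch does not reproduce.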

Cite this Paper

BibTeX
@InProceedings{pmlr-v119-jia20a,
  title     = {Information-Theoretic Local Minima Characterization and Regularization},
  author    = {Jia, Zhiwei and Su, Hao},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4773--4783},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/jia20a/jia20a.pdf},
  url       = {https://proceedings.mlr.press/v119/jia20a.html},
  abstract  = {Recent advances in deep learning theory have evoked the study of generalizability across different local minima of deep neural networks (DNNs). While current work focused on either discovering properties of good local minima or developing regularization techniques to induce good local minima, no approach exists that can tackle both problems. We achieve these two goals successfully in a unified manner. Specifically, based on the observed Fisher information we propose a metric both strongly indicative of generalizability of local minima and effectively applied as a practical regularizer. We provide theoretical analysis including a generalization bound and empirically demonstrate the success of our approach in both capturing and improving the generalizability of DNNs. Experiments are performed on CIFAR-10, CIFAR-100 and ImageNet for various network architectures.}
}
Endnote
%0 Conference Paper
%T Information-Theoretic Local Minima Characterization and Regularization
%A Zhiwei Jia
%A Hao Su
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-jia20a
%I PMLR
%P 4773--4783
%U https://proceedings.mlr.press/v119/jia20a.html
%V 119
%X Recent advances in deep learning theory have evoked the study of generalizability across different local minima of deep neural networks (DNNs). While current work focused on either discovering properties of good local minima or developing regularization techniques to induce good local minima, no approach exists that can tackle both problems. We achieve these two goals successfully in a unified manner. Specifically, based on the observed Fisher information we propose a metric both strongly indicative of generalizability of local minima and effectively applied as a practical regularizer. We provide theoretical analysis including a generalization bound and empirically demonstrate the success of our approach in both capturing and improving the generalizability of DNNs. Experiments are performed on CIFAR-10, CIFAR-100 and ImageNet for various network architectures.
APA
Jia, Z. & Su, H. (2020). Information-Theoretic Local Minima Characterization and Regularization. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4773-4783. Available from https://proceedings.mlr.press/v119/jia20a.html.
