The Hessian perspective into the Nature of Convolutional Neural Networks

Sidak Pal Singh, Thomas Hofmann, Bernhard Schölkopf
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:31930-31968, 2023.

Abstract

While Convolutional Neural Networks (CNNs) have long been investigated and applied, as well as theorized, we aim to provide a slightly different perspective into their nature — through the perspective of their Hessian maps. The reason is that the loss Hessian captures the pairwise interaction of parameters and therefore forms a natural ground to probe how the architectural aspects of CNNs get manifested in their structure and properties. We develop a framework relying on Toeplitz representation of CNNs, and then utilize it to reveal the Hessian structure and, in particular, its rank. We prove tight upper bounds (with linear activations), which closely follow the empirical trend of the Hessian rank and in practice also hold for more general settings. Overall, our work generalizes and further establishes the key insight that the Hessian rank grows as the square root of the number of parameters, even in CNNs.
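The framework described above rests on writing a convolution as multiplication by a Toeplitz-structured matrix. As a minimal illustration of that representation (the function name and setup here are ours, not from the paper), the following sketch builds the banded Toeplitz matrix of a 1-D convolutional layer with linear activation and checks that it reproduces the layer's output:

```python
import numpy as np

def conv1d_toeplitz(kernel, n):
    """Banded Toeplitz matrix T with T @ x equal to the 'valid'
    cross-correlation of x with `kernel` (a linear conv layer)."""
    k = len(kernel)
    m = n - k + 1  # output length for 'valid' mode
    T = np.zeros((m, n))
    for i in range(m):
        T[i, i:i + k] = kernel  # kernel slides one step per output row
    return T

kernel = np.array([1.0, -2.0, 0.5])
x = np.random.default_rng(0).standard_normal(8)
T = conv1d_toeplitz(kernel, len(x))

# The Toeplitz product matches NumPy's 'valid' correlation.
assert np.allclose(T @ x, np.correlate(x, kernel, mode="valid"))
```

Because the whole linear CNN then becomes a product of such structured matrices, rank arguments about the Hessian can be carried out on these Toeplitz factors; note that a layer with only k = 3 free parameters still acts as a full (n - k + 1) x n matrix.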

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-singh23a,
  title     = {The Hessian perspective into the Nature of Convolutional Neural Networks},
  author    = {Singh, Sidak Pal and Hofmann, Thomas and Sch\"{o}lkopf, Bernhard},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {31930--31968},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/singh23a/singh23a.pdf},
  url       = {https://proceedings.mlr.press/v202/singh23a.html},
  abstract  = {While Convolutional Neural Networks (CNNs) have long been investigated and applied, as well as theorized, we aim to provide a slightly different perspective into their nature — through the perspective of their Hessian maps. The reason is that the loss Hessian captures the pairwise interaction of parameters and therefore forms a natural ground to probe how the architectural aspects of CNNs get manifested in their structure and properties. We develop a framework relying on Toeplitz representation of CNNs, and then utilize it to reveal the Hessian structure and, in particular, its rank. We prove tight upper bounds (with linear activations), which closely follow the empirical trend of the Hessian rank and in practice also hold for more general settings. Overall, our work generalizes and further establishes the key insight that the Hessian rank grows as the square root of the number of parameters, even in CNNs.}
}
Endnote
%0 Conference Paper
%T The Hessian perspective into the Nature of Convolutional Neural Networks
%A Sidak Pal Singh
%A Thomas Hofmann
%A Bernhard Schölkopf
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-singh23a
%I PMLR
%P 31930--31968
%U https://proceedings.mlr.press/v202/singh23a.html
%V 202
%X While Convolutional Neural Networks (CNNs) have long been investigated and applied, as well as theorized, we aim to provide a slightly different perspective into their nature — through the perspective of their Hessian maps. The reason is that the loss Hessian captures the pairwise interaction of parameters and therefore forms a natural ground to probe how the architectural aspects of CNNs get manifested in their structure and properties. We develop a framework relying on Toeplitz representation of CNNs, and then utilize it to reveal the Hessian structure and, in particular, its rank. We prove tight upper bounds (with linear activations), which closely follow the empirical trend of the Hessian rank and in practice also hold for more general settings. Overall, our work generalizes and further establishes the key insight that the Hessian rank grows as the square root of the number of parameters, even in CNNs.
APA
Singh, S.P., Hofmann, T. & Schölkopf, B. (2023). The Hessian perspective into the Nature of Convolutional Neural Networks. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:31930-31968. Available from https://proceedings.mlr.press/v202/singh23a.html.