Multi-Layer Neural Networks as Trainable Ladders of Hilbert Spaces

Zhengdao Chen
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:4294-4329, 2023.

Abstract

To characterize the function spaces explored by multi-layer neural networks (NNs), we introduce Neural Hilbert Ladders (NHLs), a collection of reproducing kernel Hilbert spaces (RKHSes) that are defined iteratively and adaptive to training. First, we prove a correspondence between functions expressed by L-layer NNs and those belonging to L-level NHLs. Second, we prove generalization guarantees for learning the NHL based on a new complexity measure. Third, corresponding to the training of multi-layer NNs in the infinite-width mean-field limit, we derive an evolution of the NHL characterized by the dynamics of multiple random fields. Finally, we examine linear and shallow NNs from the new perspective and complement the theory with numerical results.
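To make the iterative construction concrete, here is a minimal sketch of how such a ladder of RKHSes can be built. This is an illustrative reading of the abstract, not the paper's exact definitions; the symbols $\sigma$, $\mu^{(l)}$, and $c$ are assumed notation.

% Illustrative sketch of an L-level ladder of RKHSes (assumed notation)
% Level 1: a base kernel on the input space, e.g. linear
\[ k^{(1)}(x, x') = x^\top x' \]
% Level l+1: a kernel induced by a measure \mu^{(l)} over functions h
% in the unit ball of the level-l RKHS \mathcal{H}^{(l)}
\[ k^{(l+1)}(x, x') = \mathbb{E}_{h \sim \mu^{(l)}}\!\left[ \sigma(h(x))\, \sigma(h(x')) \right] \]
% An L-level function: a weighted readout over the top-level random fields
\[ f(x) = \mathbb{E}_{h \sim \mu^{(L-1)}}\!\left[ c(h)\, \sigma(h(x)) \right] \]

Under this reading, the hidden units of an L-layer NN play the role of samples $h \sim \mu^{(l)}$, and training the NN corresponds to evolving the measures $\mu^{(l)}$; this is the sense in which the ladder is "trainable" and "adaptive to training."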

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-chen23a,
  title     = {Multi-Layer Neural Networks as Trainable Ladders of {H}ilbert Spaces},
  author    = {Chen, Zhengdao},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {4294--4329},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/chen23a/chen23a.pdf},
  url       = {https://proceedings.mlr.press/v202/chen23a.html},
  abstract  = {To characterize the function spaces explored by multi-layer neural networks (NNs), we introduce Neural Hilbert Ladders (NHLs), a collection of reproducing kernel Hilbert spaces (RKHSes) that are defined iteratively and adaptive to training. First, we prove a correspondence between functions expressed by L-layer NNs and those belonging to L-level NHLs. Second, we prove generalization guarantees for learning the NHL based on a new complexity measure. Third, corresponding to the training of multi-layer NNs in the infinite-width mean-field limit, we derive an evolution of the NHL characterized by the dynamics of multiple random fields. Finally, we examine linear and shallow NNs from the new perspective and complement the theory with numerical results.}
}
Endnote
%0 Conference Paper
%T Multi-Layer Neural Networks as Trainable Ladders of Hilbert Spaces
%A Zhengdao Chen
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-chen23a
%I PMLR
%P 4294--4329
%U https://proceedings.mlr.press/v202/chen23a.html
%V 202
%X To characterize the function spaces explored by multi-layer neural networks (NNs), we introduce Neural Hilbert Ladders (NHLs), a collection of reproducing kernel Hilbert spaces (RKHSes) that are defined iteratively and adaptive to training. First, we prove a correspondence between functions expressed by L-layer NNs and those belonging to L-level NHLs. Second, we prove generalization guarantees for learning the NHL based on a new complexity measure. Third, corresponding to the training of multi-layer NNs in the infinite-width mean-field limit, we derive an evolution of the NHL characterized by the dynamics of multiple random fields. Finally, we examine linear and shallow NNs from the new perspective and complement the theory with numerical results.
APA
Chen, Z. (2023). Multi-Layer Neural Networks as Trainable Ladders of Hilbert Spaces. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:4294-4329. Available from https://proceedings.mlr.press/v202/chen23a.html.