A Functional Perspective on Learning Symmetric Functions with Neural Networks

Aaron Zweig, Joan Bruna
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:13023-13032, 2021.

Abstract

Symmetric functions, which take as input an unordered, fixed-size set, are known to be universally representable by neural networks that enforce permutation invariance. These architectures only give guarantees for fixed input sizes, yet in many practical applications, including point clouds and particle physics, a relevant notion of generalization should include varying the input size. In this work we treat symmetric functions (of any size) as functions over probability measures, and study the learning and representation of neural networks defined on measures. By focusing on shallow architectures, we establish approximation and generalization bounds under different choices of regularization (such as RKHS and variation norms), that capture a hierarchy of functional spaces with increasing degree of non-linear learning. The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes, as we verify empirically.
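The measure-based view described in the abstract has a simple finite-sample counterpart: mean-pooling over set elements treats the input as an empirical probability measure, so one set of parameters applies to sets of any size. Below is a minimal illustrative sketch of such a shallow permutation-invariant model (a DeepSets-style architecture, not the paper's actual code); the dimensions, weights, and function names are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shallow permutation-invariant model: f(x_1..x_n) = <v, mean_i phi(x_i)>.
# Mean pooling views the set {x_i} as an empirical measure, so the same
# parameters evaluate on inputs of any size n.  (Illustrative sketch only.)
d, h = 3, 16                         # input dim, hidden width (arbitrary)
W = rng.normal(size=(d, h))          # random first-layer weights
v = rng.normal(size=h)               # linear readout weights

def phi(X):
    # Elementwise ReLU features for each set element (row of X).
    return np.maximum(X @ W, 0.0)

def f(X):
    # Pool features over the set, then apply the linear readout.
    return float(phi(X).mean(axis=0) @ v)

X = rng.normal(size=(5, d))          # a set of 5 points in R^d
perm = rng.permutation(5)
print(np.isclose(f(X), f(X[perm])))  # permutation invariance -> True

# The same f evaluates on a set of a different size without any change:
Y = rng.normal(size=(12, d))
print(f(Y))
```

Note that invariance holds because the mean is symmetric in its arguments; the paper's analysis concerns which symmetric functions such pooled architectures can approximate and how their generalization behaves as the input size varies.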

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-zweig21a,
  title     = {A Functional Perspective on Learning Symmetric Functions with Neural Networks},
  author    = {Zweig, Aaron and Bruna, Joan},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {13023--13032},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/zweig21a/zweig21a.pdf},
  url       = {https://proceedings.mlr.press/v139/zweig21a.html},
  abstract  = {Symmetric functions, which take as input an unordered, fixed-size set, are known to be universally representable by neural networks that enforce permutation invariance. These architectures only give guarantees for fixed input sizes, yet in many practical applications, including point clouds and particle physics, a relevant notion of generalization should include varying the input size. In this work we treat symmetric functions (of any size) as functions over probability measures, and study the learning and representation of neural networks defined on measures. By focusing on shallow architectures, we establish approximation and generalization bounds under different choices of regularization (such as RKHS and variation norms), that capture a hierarchy of functional spaces with increasing degree of non-linear learning. The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes, as we verify empirically.}
}
Endnote
%0 Conference Paper
%T A Functional Perspective on Learning Symmetric Functions with Neural Networks
%A Aaron Zweig
%A Joan Bruna
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-zweig21a
%I PMLR
%P 13023--13032
%U https://proceedings.mlr.press/v139/zweig21a.html
%V 139
%X Symmetric functions, which take as input an unordered, fixed-size set, are known to be universally representable by neural networks that enforce permutation invariance. These architectures only give guarantees for fixed input sizes, yet in many practical applications, including point clouds and particle physics, a relevant notion of generalization should include varying the input size. In this work we treat symmetric functions (of any size) as functions over probability measures, and study the learning and representation of neural networks defined on measures. By focusing on shallow architectures, we establish approximation and generalization bounds under different choices of regularization (such as RKHS and variation norms), that capture a hierarchy of functional spaces with increasing degree of non-linear learning. The resulting models can be learned efficiently and enjoy generalization guarantees that extend across input sizes, as we verify empirically.
APA
Zweig, A. & Bruna, J. (2021). A Functional Perspective on Learning Symmetric Functions with Neural Networks. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:13023-13032. Available from https://proceedings.mlr.press/v139/zweig21a.html.
