On the Universality of Invariant Networks

Haggai Maron, Ethan Fetaya, Nimrod Segol, Yaron Lipman
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4363-4371, 2019.

Abstract

Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received very little attention to date: can these networks approximate any (continuous) invariant function? We tackle the rather general case of an arbitrary subgroup $G \leq S_n$ of the symmetric group, acting on $\mathbb{R}^n$ by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: first, $G$-invariant networks are universal if high-order tensors are allowed; second, there are groups $G$ for which high-order tensors are unavoidable for obtaining universality. $G$-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors, and we conjecture that this condition is also sufficient.
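
To make the setting concrete, the sketch below (ours, not code from the paper) shows a first-order $G$-invariant network for the fully symmetric case $G = S_n$. Up to bias, the only $S_n$-equivariant affine maps on $\mathbb{R}^n$ have the form $x \mapsto a\,x + b\,\mathrm{mean}(x)\,\mathbf{1} + c\,\mathbf{1}$; stacking such layers with a pointwise nonlinearity and finishing with an invariant pooling (here, a sum) yields a permutation-invariant network. All names and parameters (equivariant_layer, invariant_net, a, b, c) are illustrative, not from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def equivariant_layer(x, a, b, c):
        # S_n-equivariant affine map on R^n: pointwise scaling plus a mean
        # term and a bias shared across coordinates, so permuting the input
        # simply permutes the output.
        return a * x + b * x.mean() + c

    def invariant_net(x, layer_params):
        # Equivariant layers with ReLU, then sum-pooling to an invariant scalar.
        for a, b, c in layer_params:
            x = np.maximum(equivariant_layer(x, a, b, c), 0.0)
        return x.sum()

    params = [tuple(rng.normal(size=3)) for _ in range(2)]
    x = rng.normal(size=5)
    sigma = rng.permutation(5)
    # Invariance check: a permuted input gives the same output.
    assert np.isclose(invariant_net(x, params), invariant_net(x[sigma], params))

The paper's results address whether such architectures are universal: for which groups $G$ do first-order layers like these suffice to approximate every continuous $G$-invariant function, and when are higher-order tensor layers unavoidable.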

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-maron19a,
  title     = {On the Universality of Invariant Networks},
  author    = {Maron, Haggai and Fetaya, Ethan and Segol, Nimrod and Lipman, Yaron},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4363--4371},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/maron19a/maron19a.pdf},
  url       = {https://proceedings.mlr.press/v97/maron19a.html},
  abstract  = {Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received very little attention to date: can these networks approximate any (continuous) invariant function? We tackle the rather general case of an arbitrary subgroup $G \leq S_n$ of the symmetric group, acting on $\mathbb{R}^n$ by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: first, $G$-invariant networks are universal if high-order tensors are allowed; second, there are groups $G$ for which high-order tensors are unavoidable for obtaining universality. $G$-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors, and we conjecture that this condition is also sufficient.}
}
Endnote
%0 Conference Paper
%T On the Universality of Invariant Networks
%A Haggai Maron
%A Ethan Fetaya
%A Nimrod Segol
%A Yaron Lipman
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-maron19a
%I PMLR
%P 4363--4371
%U https://proceedings.mlr.press/v97/maron19a.html
%V 97
%X Constraining linear layers in neural networks to respect symmetry transformations from a group $G$ is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received very little attention to date: can these networks approximate any (continuous) invariant function? We tackle the rather general case of an arbitrary subgroup $G \leq S_n$ of the symmetric group, acting on $\mathbb{R}^n$ by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: first, $G$-invariant networks are universal if high-order tensors are allowed; second, there are groups $G$ for which high-order tensors are unavoidable for obtaining universality. $G$-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of $G$-invariant networks that incorporate only first-order tensors, and we conjecture that this condition is also sufficient.
APA
Maron, H., Fetaya, E., Segol, N. & Lipman, Y. (2019). On the Universality of Invariant Networks. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4363-4371. Available from https://proceedings.mlr.press/v97/maron19a.html.