On the Universality of Invariant Networks
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4363-4371, 2019.
Abstract
Constraining linear layers in neural networks to respect symmetry transformations from a group G is a common design principle for invariant networks that has found many applications in machine learning. In this paper, we consider a fundamental question that has received very little attention to date: Can these networks approximate any (continuous) invariant function? We tackle the rather general case where G ≤ S_n (an arbitrary subgroup of the symmetric group) acts on ℝ^n by permuting coordinates. This setting includes several recent popular invariant networks. We present two main results: First, G-invariant networks are universal if high-order tensors are allowed. Second, there are groups G for which higher-order tensors are unavoidable for obtaining universality. G-invariant networks consisting of only first-order tensors are of special interest due to their practical value. We conclude the paper by proving a necessary condition for the universality of G-invariant networks that incorporate only first-order tensors. Lastly, we propose a conjecture stating that this condition is also sufficient.
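The paper itself contains no code; as a hypothetical illustration of the first-order setting it studies, the sketch below builds an S_n-invariant network in the standard way: permutation-equivariant linear layers (for G = S_n, every such map on ℝ^n has the form a·x + b·mean(x)·1 + c·1), a pointwise nonlinearity, an invariant sum pooling, and a small readout. All parameter names and sizes are illustrative, and the numerical check at the end simply verifies invariance under a random coordinate permutation.

```python
import numpy as np

rng = np.random.default_rng(0)

def equivariant_layer(x, a, b, c):
    # For G = S_n, every permutation-equivariant linear map on R^n
    # can be written as a*x + b*mean(x)*1, plus a constant bias c*1.
    return a * x + b * x.mean() + c

def invariant_network(x, params):
    # Equivariant layers with ReLU, then an S_n-invariant pooling
    # (sum), then a tiny MLP applied to the pooled scalar.
    h = x
    for (a, b, c) in params["equivariant"]:
        h = np.maximum(equivariant_layer(h, a, b, c), 0.0)
    pooled = h.sum()  # invariant under any permutation of coordinates
    w1, w2 = params["mlp"]
    return w2 * np.maximum(w1 * pooled, 0.0)

# Illustrative random parameters (two equivariant layers, scalar MLP).
params = {
    "equivariant": [tuple(rng.normal(size=3)) for _ in range(2)],
    "mlp": tuple(rng.normal(size=2)),
}

x = rng.normal(size=5)
perm = rng.permutation(5)
# Permuting the input coordinates leaves the output unchanged.
print(np.isclose(invariant_network(x, params),
                 invariant_network(x[perm], params)))  # True
```

For a proper subgroup G < S_n, the space of G-equivariant linear maps is larger than the two-parameter family above (it is spanned by the indicator matrices of the orbits of G acting on index pairs), which is where the paper's question about the sufficiency of first-order tensors becomes nontrivial.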