Improved generalization bounds of group invariant / equivariant deep networks via quotient feature spaces

Akiyoshi Sannai, Masaaki Imaizumi, Makoto Kawano
Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, PMLR 161:771-780, 2021.

Abstract

Numerous invariant (or equivariant) neural networks have succeeded in handling invariant data such as point clouds and graphs. However, a generalization theory for such networks has not been well developed, because several factors essential to the theory, such as network size and margin distribution, are not deeply connected to invariance and equivariance. In this study, we develop a novel generalization error bound for invariant and equivariant deep neural networks. To describe the effect of invariance and equivariance on generalization, we introduce the notion of a quotient feature space, which measures the effect of group actions on these properties. Our main result proves that the volume of the quotient feature space describes the generalization error. Furthermore, the bound shows that invariance and equivariance significantly improve its leading term. We apply our result to specific invariant and equivariant networks, such as DeepSets (Zaheer et al., NIPS 2017), and show that their generalization bound is improved by a factor of $\sqrt{n!}$, where $n!$ is the number of permutations. We also discuss the expressive power of invariant DNNs and show that they can achieve an optimal approximation rate. Moreover, we conduct experiments to support our results.
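
To make the permutation-invariance setting concrete, below is a minimal NumPy sketch of a DeepSets-style network $f(X) = \rho(\sum_i \phi(x_i))$ in the sense of Zaheer et al. (2017). The layer sizes and random weights are hypothetical, chosen only to keep the example self-contained; this illustrates the invariance property the bound exploits, not the paper's experimental setup.

# A minimal NumPy sketch of a DeepSets-style permutation-invariant network
# f(X) = rho( sum_i phi(x_i) ), in the sense of Zaheer et al. (2017).
# Layer sizes and random weights are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# phi: per-element feature map (one hidden ReLU layer), applied row-wise.
W1 = rng.standard_normal((3, 8))   # input dim 3 -> hidden dim 8
W2 = rng.standard_normal((8, 8))   # hidden dim 8 -> feature dim 8

def phi(X):
    return np.maximum(X @ W1, 0.0) @ W2

# rho: read-out applied after sum-pooling over the set.
W3 = rng.standard_normal((8, 1))

def deepsets(X):
    # Sum-pooling commutes with any reordering of the rows of X,
    # so deepsets(X) is invariant under all n! permutations.
    return phi(X).sum(axis=0) @ W3

X = rng.standard_normal((5, 3))   # a set of n = 5 points in R^3
perm = rng.permutation(5)         # an arbitrary reordering

print(deepsets(X), deepsets(X[perm]))               # same output (up to float round-off)
print(np.allclose(deepsets(X), deepsets(X[perm])))  # True

Because sum-pooling is order-independent, the network takes a single value on each orbit of the symmetric group $S_n$. One hedged reading of the abstract's claim: if a baseline bound scales as $C/\sqrt{m}$ with sample size $m$ and complexity term $C$, the $S_n$-invariant counterpart scales as $C/(\sqrt{n!}\,\sqrt{m})$; the precise complexity measure is defined in the paper.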

Cite this Paper


BibTeX
@InProceedings{pmlr-v161-sannai21a,
  title = {Improved generalization bounds of group invariant / equivariant deep networks via quotient feature spaces},
  author = {Sannai, Akiyoshi and Imaizumi, Masaaki and Kawano, Makoto},
  booktitle = {Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence},
  pages = {771--780},
  year = {2021},
  editor = {de Campos, Cassio and Maathuis, Marloes H.},
  volume = {161},
  series = {Proceedings of Machine Learning Research},
  month = {27--30 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v161/sannai21a/sannai21a.pdf},
  url = {https://proceedings.mlr.press/v161/sannai21a.html},
  abstract = {Numerous invariant (or equivariant) neural networks have succeeded in handling invariant data such as point clouds and graphs. However, a generalization theory for such networks has not been well developed, because several factors essential to the theory, such as network size and margin distribution, are not deeply connected to invariance and equivariance. In this study, we develop a novel generalization error bound for invariant and equivariant deep neural networks. To describe the effect of invariance and equivariance on generalization, we introduce the notion of a quotient feature space, which measures the effect of group actions on these properties. Our main result proves that the volume of the quotient feature space describes the generalization error. Furthermore, the bound shows that invariance and equivariance significantly improve its leading term. We apply our result to specific invariant and equivariant networks, such as DeepSets (Zaheer et al., NIPS 2017), and show that their generalization bound is improved by a factor of $\sqrt{n!}$, where $n!$ is the number of permutations. We also discuss the expressive power of invariant DNNs and show that they can achieve an optimal approximation rate. Moreover, we conduct experiments to support our results.}
}
Endnote
%0 Conference Paper
%T Improved generalization bounds of group invariant / equivariant deep networks via quotient feature spaces
%A Akiyoshi Sannai
%A Masaaki Imaizumi
%A Makoto Kawano
%B Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence
%C Proceedings of Machine Learning Research
%D 2021
%E Cassio de Campos
%E Marloes H. Maathuis
%F pmlr-v161-sannai21a
%I PMLR
%P 771--780
%U https://proceedings.mlr.press/v161/sannai21a.html
%V 161
%X Numerous invariant (or equivariant) neural networks have succeeded in handling invariant data such as point clouds and graphs. However, a generalization theory for such networks has not been well developed, because several factors essential to the theory, such as network size and margin distribution, are not deeply connected to invariance and equivariance. In this study, we develop a novel generalization error bound for invariant and equivariant deep neural networks. To describe the effect of invariance and equivariance on generalization, we introduce the notion of a quotient feature space, which measures the effect of group actions on these properties. Our main result proves that the volume of the quotient feature space describes the generalization error. Furthermore, the bound shows that invariance and equivariance significantly improve its leading term. We apply our result to specific invariant and equivariant networks, such as DeepSets (Zaheer et al., NIPS 2017), and show that their generalization bound is improved by a factor of $\sqrt{n!}$, where $n!$ is the number of permutations. We also discuss the expressive power of invariant DNNs and show that they can achieve an optimal approximation rate. Moreover, we conduct experiments to support our results.
APA
Sannai, A., Imaizumi, M. & Kawano, M. (2021). Improved generalization bounds of group invariant / equivariant deep networks via quotient feature spaces. Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence, in Proceedings of Machine Learning Research 161:771-780. Available from https://proceedings.mlr.press/v161/sannai21a.html.
