Set Norm and Equivariant Skip Connections: Putting the Deep in Deep Sets

Lily Zhang, Veronica Tozzo, John Higgins, Rajesh Ranganath
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:26559-26574, 2022.

Abstract

Permutation invariant neural networks are a promising tool for predictive modeling of set data. We show, however, that existing architectures struggle to perform well when they are deep. In this work, we mathematically and empirically analyze normalization layers and residual connections in the context of deep permutation invariant neural networks. We develop set norm, a normalization tailored for sets, and introduce the “clean path principle” for equivariant residual connections alongside a novel benefit of such connections, the reduction of information loss. Based on our analysis, we propose Deep Sets++ and Set Transformer++, deep models that reach comparable or better performance than their original counterparts on a diverse suite of tasks. We additionally introduce Flow-RBC, a new single-cell dataset and real-world application of permutation invariant prediction. We open-source our data and code here: https://github.com/rajesh-lab/deep_permutation_invariant.

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-zhang22ac,
  title     = {Set Norm and Equivariant Skip Connections: Putting the Deep in Deep Sets},
  author    = {Zhang, Lily and Tozzo, Veronica and Higgins, John and Ranganath, Rajesh},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {26559--26574},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/zhang22ac/zhang22ac.pdf},
  url       = {https://proceedings.mlr.press/v162/zhang22ac.html},
  abstract  = {Permutation invariant neural networks are a promising tool for predictive modeling of set data. We show, however, that existing architectures struggle to perform well when they are deep. In this work, we mathematically and empirically analyze normalization layers and residual connections in the context of deep permutation invariant neural networks. We develop set norm, a normalization tailored for sets, and introduce the “clean path principle” for equivariant residual connections alongside a novel benefit of such connections, the reduction of information loss. Based on our analysis, we propose Deep Sets++ and Set Transformer++, deep models that reach comparable or better performance than their original counterparts on a diverse suite of tasks. We additionally introduce Flow-RBC, a new single-cell dataset and real-world application of permutation invariant prediction. We open-source our data and code here: https://github.com/rajesh-lab/deep_permutation_invariant.}
}
Endnote
%0 Conference Paper
%T Set Norm and Equivariant Skip Connections: Putting the Deep in Deep Sets
%A Lily Zhang
%A Veronica Tozzo
%A John Higgins
%A Rajesh Ranganath
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-zhang22ac
%I PMLR
%P 26559--26574
%U https://proceedings.mlr.press/v162/zhang22ac.html
%V 162
%X Permutation invariant neural networks are a promising tool for predictive modeling of set data. We show, however, that existing architectures struggle to perform well when they are deep. In this work, we mathematically and empirically analyze normalization layers and residual connections in the context of deep permutation invariant neural networks. We develop set norm, a normalization tailored for sets, and introduce the “clean path principle” for equivariant residual connections alongside a novel benefit of such connections, the reduction of information loss. Based on our analysis, we propose Deep Sets++ and Set Transformer++, deep models that reach comparable or better performance than their original counterparts on a diverse suite of tasks. We additionally introduce Flow-RBC, a new single-cell dataset and real-world application of permutation invariant prediction. We open-source our data and code here: https://github.com/rajesh-lab/deep_permutation_invariant.
APA
Zhang, L., Tozzo, V., Higgins, J. & Ranganath, R. (2022). Set Norm and Equivariant Skip Connections: Putting the Deep in Deep Sets. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:26559-26574. Available from https://proceedings.mlr.press/v162/zhang22ac.html.