Learning Symmetric Embeddings for Equivariant World Models

Jung Yeon Park, Ondrej Biza, Linfeng Zhao, Jan-Willem Van De Meent, Robin Walters
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:17372-17389, 2022.

Abstract

Incorporating symmetries can lead to highly data-efficient and generalizable models by defining equivalence classes of data samples related by transformations. However, characterizing how transformations act on input data is often difficult, limiting the applicability of equivariant models. We propose learning symmetric embedding networks (SENs) that encode an input space (e.g. images), where we do not know the effect of transformations (e.g. rotations), to a feature space that transforms in a known manner under these operations. This network can be trained end-to-end with an equivariant task network to learn an explicitly symmetric representation. We validate this approach in the context of equivariant transition models with 3 distinct forms of symmetry. Our experiments demonstrate that SENs facilitate the application of equivariant networks to data with complex symmetry representations. Moreover, doing so can yield improvements in accuracy and generalization relative to both fully-equivariant and non-equivariant baselines.
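The abstract describes a recipe: a learned encoder maps raw observations to a feature space on which the group acts in a known way, this encoder is composed with an equivariant task network, and the whole pipeline is trained end-to-end. The sketch below, a minimal and hypothetical PyTorch illustration rather than the authors' implementation, makes this concrete for the cyclic rotation group C4: the SEN outputs features in the regular representation, so rotations act by a known cyclic shift of the group axis; the transition model is equivariant by construction through weight sharing across group elements; and a plain MSE latent-prediction loss stands in for the paper's training objective. All module names, sizes, and architectures are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SEN(nn.Module):
    """Symmetric embedding network (hypothetical sketch).

    Maps images to features of shape (batch, |G|, d): |G| copies of a
    d-dimensional base feature, i.e. the regular representation of the
    rotation group C4, so a rotation by g acts as a cyclic shift.
    """
    def __init__(self, group_order: int = 4, base_dim: int = 16):
        super().__init__()
        self.group_order, self.base_dim = group_order, base_dim
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, group_order * base_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).view(-1, self.group_order, self.base_dim)


def group_act(g: int, z: torch.Tensor) -> torch.Tensor:
    """The *known* action of g in C4 on the feature space: cyclically
    shift the group axis of the regular representation."""
    return torch.roll(z, shifts=g, dims=1)


class EquivariantTransition(nn.Module):
    """Transition model equivariant by construction: one shared MLP is
    applied at every group element, so it commutes with cyclic shifts.
    The action is treated as group-invariant here for simplicity."""
    def __init__(self, base_dim: int = 16, num_actions: int = 4):
        super().__init__()
        self.num_actions = num_actions
        self.mlp = nn.Sequential(
            nn.Linear(base_dim + num_actions, 64), nn.ReLU(),
            nn.Linear(64, base_dim),
        )

    def forward(self, z: torch.Tensor, a: torch.Tensor) -> torch.Tensor:
        # Broadcast the action one-hot across the group axis and predict
        # a residual update of the current embedding.
        a_rep = nn.functional.one_hot(a, self.num_actions).float()
        a_rep = a_rep[:, None, :].expand(-1, z.shape[1], -1)
        return z + self.mlp(torch.cat([z, a_rep], dim=-1))


# End-to-end training step on a batch of transitions (s, a, s'). Dummy
# tensors stand in for real observations.
encoder, transition = SEN(), EquivariantTransition()
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(transition.parameters()), lr=1e-3
)
s = torch.randn(8, 3, 64, 64)        # current observations
s_next = torch.randn(8, 3, 64, 64)   # next observations
a = torch.randint(0, 4, (8,))        # discrete actions

loss = nn.functional.mse_loss(transition(encoder(s), a), encoder(s_next))
opt.zero_grad()
loss.backward()
opt.step()
```

Because the per-element MLP is shared across the group axis, transition(group_act(g, z), a) equals group_act(g, transition(z, a)) for every g, which is exactly the equivariance constraint the task network must satisfy; the encoder itself is unconstrained and learns end-to-end to place inputs consistently in this symmetric feature space.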

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-park22a,
  title     = {Learning Symmetric Embeddings for Equivariant World Models},
  author    = {Park, Jung Yeon and Biza, Ondrej and Zhao, Linfeng and Van De Meent, Jan-Willem and Walters, Robin},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {17372--17389},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/park22a/park22a.pdf},
  url       = {https://proceedings.mlr.press/v162/park22a.html},
  abstract  = {Incorporating symmetries can lead to highly data-efficient and generalizable models by defining equivalence classes of data samples related by transformations. However, characterizing how transformations act on input data is often difficult, limiting the applicability of equivariant models. We propose learning symmetric embedding networks (SENs) that encode an input space (e.g. images), where we do not know the effect of transformations (e.g. rotations), to a feature space that transforms in a known manner under these operations. This network can be trained end-to-end with an equivariant task network to learn an explicitly symmetric representation. We validate this approach in the context of equivariant transition models with 3 distinct forms of symmetry. Our experiments demonstrate that SENs facilitate the application of equivariant networks to data with complex symmetry representations. Moreover, doing so can yield improvements in accuracy and generalization relative to both fully-equivariant and non-equivariant baselines.}
}
EndNote
%0 Conference Paper
%T Learning Symmetric Embeddings for Equivariant World Models
%A Jung Yeon Park
%A Ondrej Biza
%A Linfeng Zhao
%A Jan-Willem Van De Meent
%A Robin Walters
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-park22a
%I PMLR
%P 17372--17389
%U https://proceedings.mlr.press/v162/park22a.html
%V 162
%X Incorporating symmetries can lead to highly data-efficient and generalizable models by defining equivalence classes of data samples related by transformations. However, characterizing how transformations act on input data is often difficult, limiting the applicability of equivariant models. We propose learning symmetric embedding networks (SENs) that encode an input space (e.g. images), where we do not know the effect of transformations (e.g. rotations), to a feature space that transforms in a known manner under these operations. This network can be trained end-to-end with an equivariant task network to learn an explicitly symmetric representation. We validate this approach in the context of equivariant transition models with 3 distinct forms of symmetry. Our experiments demonstrate that SENs facilitate the application of equivariant networks to data with complex symmetry representations. Moreover, doing so can yield improvements in accuracy and generalization relative to both fully-equivariant and non-equivariant baselines.
APA
Park, J.Y., Biza, O., Zhao, L., Van De Meent, J.W. &amp; Walters, R. (2022). Learning Symmetric Embeddings for Equivariant World Models. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:17372-17389. Available from https://proceedings.mlr.press/v162/park22a.html.
