LieTransformer: Equivariant Self-Attention for Lie Groups

Michael J Hutchinson, Charline Le Lan, Sheheryar Zaidi, Emilien Dupont, Yee Whye Teh, Hyunjik Kim
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:4533-4543, 2021.

Abstract

Group equivariant neural networks are used as building blocks of group invariant neural networks, which have been shown to improve generalisation performance and data efficiency through principled parameter sharing. Such works have mostly focused on group equivariant convolutions, building on the result that group equivariant linear maps are necessarily convolutions. In this work, we extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models. We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups. We demonstrate the generality of our approach by showing experimental results that are competitive with baseline methods on a wide range of tasks: shape counting on point clouds, molecular property regression and modelling particle trajectories under Hamiltonian dynamics.
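
To make the central property in the abstract concrete: a layer f is equivariant to a group G if transforming the input and then applying f gives the same result as applying f and then transforming the output, i.e. f(g · x) = g · f(x) for every g in G. The sketch below illustrates this for G = SO(2) acting on a 2D point cloud, using a toy attention map whose weights depend only on pairwise distances. The function toy_invariant_attention is purely illustrative and an assumption of this sketch; it is not the paper's LieSelfAttention layer.

```python
# Minimal illustrative sketch (not the paper's implementation) of group
# equivariance: f(g . x) = g . f(x). Here G = SO(2) acts by rotating a 2D
# point cloud, and the toy attention weights come from pairwise distances,
# which rotations preserve, so the output rotates along with the input.
import numpy as np

def toy_invariant_attention(points):
    # Attention weights from rotation-invariant pairwise distances.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    w = np.exp(-d)
    w /= w.sum(axis=-1, keepdims=True)
    # Values are the coordinates themselves, so outputs live in the same space.
    return w @ points

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 2))                      # a small 2D point cloud
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a group element g in SO(2)

lhs = toy_invariant_attention(x @ R.T)   # f(g . x)
rhs = toy_invariant_attention(x) @ R.T   # g . f(x)
print(np.allclose(lhs, rhs))             # True: the toy layer is SO(2)-equivariant
```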

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-hutchinson21a,
  title     = {LieTransformer: Equivariant Self-Attention for Lie Groups},
  author    = {Hutchinson, Michael J and Lan, Charline Le and Zaidi, Sheheryar and Dupont, Emilien and Teh, Yee Whye and Kim, Hyunjik},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {4533--4543},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/hutchinson21a/hutchinson21a.pdf},
  url       = {https://proceedings.mlr.press/v139/hutchinson21a.html},
  abstract  = {Group equivariant neural networks are used as building blocks of group invariant neural networks, which have been shown to improve generalisation performance and data efficiency through principled parameter sharing. Such works have mostly focused on group equivariant convolutions, building on the result that group equivariant linear maps are necessarily convolutions. In this work, we extend the scope of the literature to self-attention, that is emerging as a prominent building block of deep learning models. We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups. We demonstrate the generality of our approach by showing experimental results that are competitive to baseline methods on a wide range of tasks: shape counting on point clouds, molecular property regression and modelling particle trajectories under Hamiltonian dynamics.}
}
Endnote
%0 Conference Paper
%T LieTransformer: Equivariant Self-Attention for Lie Groups
%A Michael J Hutchinson
%A Charline Le Lan
%A Sheheryar Zaidi
%A Emilien Dupont
%A Yee Whye Teh
%A Hyunjik Kim
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-hutchinson21a
%I PMLR
%P 4533--4543
%U https://proceedings.mlr.press/v139/hutchinson21a.html
%V 139
%X Group equivariant neural networks are used as building blocks of group invariant neural networks, which have been shown to improve generalisation performance and data efficiency through principled parameter sharing. Such works have mostly focused on group equivariant convolutions, building on the result that group equivariant linear maps are necessarily convolutions. In this work, we extend the scope of the literature to self-attention, that is emerging as a prominent building block of deep learning models. We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups. We demonstrate the generality of our approach by showing experimental results that are competitive to baseline methods on a wide range of tasks: shape counting on point clouds, molecular property regression and modelling particle trajectories under Hamiltonian dynamics.
APA
Hutchinson, M.J., Lan, C.L., Zaidi, S., Dupont, E., Teh, Y.W. & Kim, H. (2021). LieTransformer: Equivariant Self-Attention for Lie Groups. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:4533-4543. Available from https://proceedings.mlr.press/v139/hutchinson21a.html.
