Euclidean, Projective, Conformal: Choosing a Geometric Algebra for Equivariant Transformers

Pim de Haan, Taco Cohen, Johann Brehmer
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3088-3096, 2024.

Abstract

The Geometric Algebra Transformer (GATr) is a versatile architecture for geometric deep learning based on projective geometric algebra. We generalize this architecture into a blueprint that allows one to construct a scalable transformer architecture given any geometric (or Clifford) algebra. We study versions of this architecture for Euclidean, projective, and conformal algebras, all of which are suited to represent 3D data, and evaluate them in theory and practice. The simplest Euclidean architecture is computationally cheap, but has a smaller symmetry group and is not as sample-efficient, while the projective model is not sufficiently expressive. Both the conformal algebra and an improved version of the projective algebra define powerful, performant architectures.

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-haan24a,
  title = {Euclidean, Projective, Conformal: Choosing a Geometric Algebra for Equivariant Transformers},
  author = {de Haan, Pim and Cohen, Taco and Brehmer, Johann},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages = {3088--3096},
  year = {2024},
  editor = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume = {238},
  series = {Proceedings of Machine Learning Research},
  month = {02--04 May},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v238/haan24a/haan24a.pdf},
  url = {https://proceedings.mlr.press/v238/haan24a.html},
  abstract = {The Geometric Algebra Transformer (GATr) is a versatile architecture for geometric deep learning based on projective geometric algebra. We generalize this architecture into a blueprint that allows one to construct a scalable transformer architecture given any geometric (or Clifford) algebra. We study versions of this architecture for Euclidean, projective, and conformal algebras, all of which are suited to represent 3D data, and evaluate them in theory and practice. The simplest Euclidean architecture is computationally cheap, but has a smaller symmetry group and is not as sample-efficient, while the projective model is not sufficiently expressive. Both the conformal algebra and an improved version of the projective algebra define powerful, performant architectures.}
}
Endnote
%0 Conference Paper
%T Euclidean, Projective, Conformal: Choosing a Geometric Algebra for Equivariant Transformers
%A Pim de Haan
%A Taco Cohen
%A Johann Brehmer
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-haan24a
%I PMLR
%P 3088--3096
%U https://proceedings.mlr.press/v238/haan24a.html
%V 238
%X The Geometric Algebra Transformer (GATr) is a versatile architecture for geometric deep learning based on projective geometric algebra. We generalize this architecture into a blueprint that allows one to construct a scalable transformer architecture given any geometric (or Clifford) algebra. We study versions of this architecture for Euclidean, projective, and conformal algebras, all of which are suited to represent 3D data, and evaluate them in theory and practice. The simplest Euclidean architecture is computationally cheap, but has a smaller symmetry group and is not as sample-efficient, while the projective model is not sufficiently expressive. Both the conformal algebra and an improved version of the projective algebra define powerful, performant architectures.
APA
de Haan, P., Cohen, T. & Brehmer, J. (2024). Euclidean, Projective, Conformal: Choosing a Geometric Algebra for Equivariant Transformers. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3088-3096. Available from https://proceedings.mlr.press/v238/haan24a.html.