Transport of Algebraic Structure to Latent Embeddings

Samuel Pfrommer, Brendon G. Anderson, Somayeh Sojoudi
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:40570-40591, 2024.

Abstract

Machine learning often aims to produce latent embeddings of inputs which lie in a larger, abstract mathematical space. For example, in the field of 3D modeling, subsets of Euclidean space can be embedded as vectors using implicit neural representations. Such subsets also have a natural algebraic structure including operations (e.g., union) and corresponding laws (e.g., associativity). How can we learn to "union" two sets using only their latent embeddings while respecting associativity? We propose a general procedure for parameterizing latent space operations that are provably consistent with the laws on the input space. This is achieved by learning a bijection from the latent space to a carefully designed mirrored algebra which is constructed on Euclidean space in accordance with desired laws. We evaluate these structural transport nets for a range of mirrored algebras against baselines that operate directly on the latent space. Our experiments provide strong evidence that respecting the underlying algebraic structure of the input space is key for learning accurate and self-consistent operations.

Cite this Paper

BibTeX
@InProceedings{pmlr-v235-pfrommer24a,
  title     = {Transport of Algebraic Structure to Latent Embeddings},
  author    = {Pfrommer, Samuel and Anderson, Brendon G. and Sojoudi, Somayeh},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {40570--40591},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/pfrommer24a/pfrommer24a.pdf},
  url       = {https://proceedings.mlr.press/v235/pfrommer24a.html},
  abstract  = {Machine learning often aims to produce latent embeddings of inputs which lie in a larger, abstract mathematical space. For example, in the field of 3D modeling, subsets of Euclidean space can be embedded as vectors using implicit neural representations. Such subsets also have a natural algebraic structure including operations (e.g., union) and corresponding laws (e.g., associativity). How can we learn to "union" two sets using only their latent embeddings while respecting associativity? We propose a general procedure for parameterizing latent space operations that are provably consistent with the laws on the input space. This is achieved by learning a bijection from the latent space to a carefully designed mirrored algebra which is constructed on Euclidean space in accordance with desired laws. We evaluate these structural transport nets for a range of mirrored algebras against baselines that operate directly on the latent space. Our experiments provide strong evidence that respecting the underlying algebraic structure of the input space is key for learning accurate and self-consistent operations.}
}
Endnote
%0 Conference Paper
%T Transport of Algebraic Structure to Latent Embeddings
%A Samuel Pfrommer
%A Brendon G. Anderson
%A Somayeh Sojoudi
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-pfrommer24a
%I PMLR
%P 40570--40591
%U https://proceedings.mlr.press/v235/pfrommer24a.html
%V 235
%X Machine learning often aims to produce latent embeddings of inputs which lie in a larger, abstract mathematical space. For example, in the field of 3D modeling, subsets of Euclidean space can be embedded as vectors using implicit neural representations. Such subsets also have a natural algebraic structure including operations (e.g., union) and corresponding laws (e.g., associativity). How can we learn to "union" two sets using only their latent embeddings while respecting associativity? We propose a general procedure for parameterizing latent space operations that are provably consistent with the laws on the input space. This is achieved by learning a bijection from the latent space to a carefully designed mirrored algebra which is constructed on Euclidean space in accordance with desired laws. We evaluate these structural transport nets for a range of mirrored algebras against baselines that operate directly on the latent space. Our experiments provide strong evidence that respecting the underlying algebraic structure of the input space is key for learning accurate and self-consistent operations.
APA
Pfrommer, S., Anderson, B. G., & Sojoudi, S. (2024). Transport of Algebraic Structure to Latent Embeddings. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research, 235:40570-40591. Available from https://proceedings.mlr.press/v235/pfrommer24a.html.