Geometrically Enriched Latent Spaces

Georgios Arvanitidis, Søren Hauberg, Bernhard Schölkopf
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:631-639, 2021.

Abstract

A common assumption in generative models is that the generator immerses the latent space into a Euclidean ambient space. Instead, we consider the ambient space to be a Riemannian manifold, which allows for encoding domain knowledge through the associated Riemannian metric. Shortest paths can then be defined accordingly in the latent space to both follow the learned manifold and respect the ambient geometry. Through careful design of the ambient metric we can ensure that shortest paths are well-behaved even for deterministic generators that would otherwise exhibit a misleading bias. Experimentally we show that our approach improves the interpretability and the functionality of learned representations using both stochastic and deterministic generators.
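To make the construction concrete: if the generator is a smooth map f from the latent space to the ambient manifold with metric G, the latent space inherits the pullback metric M(z) = J_f(z)^T G(f(z)) J_f(z), and shortest paths are curves minimizing length (or energy) under M. The following is a minimal JAX sketch of this idea, not the authors' implementation; the toy generator, the conformal choice of ambient_metric, and the plain gradient-descent curve solver are all illustrative assumptions.

import jax
import jax.numpy as jnp

# Toy decoder weights (hypothetical stand-in for a trained generator).
key = jax.random.PRNGKey(0)
d, D = 2, 5                                  # latent and ambient dimensions
W1 = 0.1 * jax.random.normal(key, (d, 16))
W2 = 0.1 * jax.random.normal(key, (16, D))

def generator(z):
    # Smooth map f: R^d -> R^D; in practice a trained decoder.
    return jnp.tanh(z @ W1) @ W2

def ambient_metric(x):
    # G(x): a D x D SPD matrix encoding domain knowledge about the
    # ambient space; G = I recovers the usual Euclidean assumption.
    return (1.0 + jnp.sum(x ** 2)) * jnp.eye(x.shape[0])

def pullback_metric(z):
    # M(z) = J_f(z)^T G(f(z)) J_f(z): the Riemannian metric the
    # latent space inherits from the ambient manifold.
    J = jax.jacfwd(generator)(z)             # D x d Jacobian
    return J.T @ ambient_metric(generator(z)) @ J

def curve_energy(points):
    # Discrete energy of a latent curve; minimizers with fixed
    # endpoints approximate shortest paths under M(z).
    deltas = points[1:] - points[:-1]
    mids = 0.5 * (points[1:] + points[:-1])
    Ms = jax.vmap(pullback_metric)(mids)
    return jnp.sum(jnp.einsum('ni,nij,nj->n', deltas, Ms, deltas))

# Relax a straight line between two latent points toward a geodesic.
z0, z1 = jnp.zeros(d), jnp.ones(d)
ts = jnp.linspace(0.0, 1.0, 16)[1:-1, None]
inner = (1.0 - ts) * z0 + ts * z1            # interior points only

def objective(inner):
    pts = jnp.concatenate([z0[None], inner, z1[None]], axis=0)
    return curve_energy(pts)

grad_fn = jax.jit(jax.grad(objective))
for _ in range(500):
    inner = inner - 0.05 * grad_fn(inner)

Minimizing the discrete curve energy with fixed endpoints is one standard way to approximate geodesics; the paper's actual choice of ambient metric and geodesic solver may differ from this sketch.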

Cite this Paper


BibTeX

@InProceedings{pmlr-v130-arvanitidis21a,
  title     = {Geometrically Enriched Latent Spaces},
  author    = {Arvanitidis, Georgios and Hauberg, Soren and Sch{\"o}lkopf, Bernhard},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {631--639},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/arvanitidis21a/arvanitidis21a.pdf},
  url       = {https://proceedings.mlr.press/v130/arvanitidis21a.html}
}
APA

Arvanitidis, G., Hauberg, S., & Schölkopf, B. (2021). Geometrically Enriched Latent Spaces. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:631-639. Available from https://proceedings.mlr.press/v130/arvanitidis21a.html.
