Breaking isometric ties and introducing priors in Gromov-Wasserstein distances

Pinar Demetci, Quang Huy Tran, Ievgen Redko, Ritambhara Singh
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:298-306, 2024.

Abstract

The Gromov-Wasserstein distance has many applications in machine learning due to its ability to compare measures across metric spaces and its invariance to isometric transformations. However, in certain applications this invariance can be too permissive and thus undesirable. Moreover, the Gromov-Wasserstein distance considers only pairwise sample similarities within the input datasets, disregarding the raw feature representations. We propose a new optimal transport formulation, called Augmented Gromov-Wasserstein (AGW), that allows for some control over the level of rigidity to transformations. It also incorporates feature alignments, enabling us to better leverage prior knowledge on the input data for improved performance. We first present theoretical insights into the proposed method. We then demonstrate its usefulness for single-cell multi-omic alignment tasks and heterogeneous domain adaptation in machine learning.
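
The abstract does not spell out the AGW objective, but it describes a formulation that blends a structure-based Gromov-Wasserstein term with a feature-alignment term. As a rough, hypothetical sketch only (not the paper's exact formulation or solver), the NumPy snippet below evaluates one such interpolated objective for given sample and feature couplings: a trade-off weight alpha moves between a quadratic GW discrepancy on intra-domain distance matrices and a CO-Optimal-Transport-style term on raw features. The function name, the squared losses, and the product couplings in the usage example are all illustrative assumptions.

```python
import numpy as np

def agw_objective(Cx, Cy, X, Y, pi_s, pi_f, alpha=0.5):
    """Evaluate (1 - alpha) * GW term + alpha * feature term for given couplings.

    Cx (n, n), Cy (m, m): intra-domain pairwise distance matrices.
    X (n, dx), Y (m, dy): raw feature matrices.
    pi_s (n, m): sample coupling; pi_f (dx, dy): feature coupling.
    """
    # Structure (GW) term: squared mismatch between pairwise distances,
    # weighted by the sample coupling on both index pairs.
    diff = Cx[:, None, :, None] - Cy[None, :, None, :]      # (n, m, n, m)
    gw_term = np.einsum("ijkl,ij,kl->", diff ** 2, pi_s, pi_s)

    # Feature term (CO-Optimal-Transport style): squared mismatch between raw
    # feature entries, weighted jointly by the sample and feature couplings.
    fdiff = X[:, None, :, None] - Y[None, :, None, :]       # (n, m, dx, dy)
    feat_term = np.einsum("ijfg,ij,fg->", fdiff ** 2, pi_s, pi_f)

    return (1 - alpha) * gw_term + alpha * feat_term

# Tiny usage example with trivial product couplings (illustrative only).
rng = np.random.default_rng(0)
n, m, dx, dy = 5, 6, 3, 4
X, Y = rng.normal(size=(n, dx)), rng.normal(size=(m, dy))
Cx = np.linalg.norm(X[:, None] - X[None], axis=-1)          # (n, n) distances
Cy = np.linalg.norm(Y[:, None] - Y[None], axis=-1)          # (m, m) distances
pi_s = np.full((n, m), 1.0 / (n * m))
pi_f = np.full((dx, dy), 1.0 / (dx * dy))
print(agw_objective(Cx, Cy, X, Y, pi_s, pi_f, alpha=0.5))
```

Setting alpha near 0 recovers a pure structure-matching objective (full invariance to isometries), while larger alpha weights the raw-feature term and thereby restricts which isometric transformations remain cost-free, which is one way to read the "control over the level of rigidity" claimed in the abstract.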

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-demetci24a,
  title     = {Breaking isometric ties and introducing priors in {G}romov-{W}asserstein distances},
  author    = {Demetci, Pinar and Huy Tran, Quang and Redko, Ievgen and Singh, Ritambhara},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {298--306},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/demetci24a/demetci24a.pdf},
  url       = {https://proceedings.mlr.press/v238/demetci24a.html}
}
Endnote
%0 Conference Paper
%T Breaking isometric ties and introducing priors in Gromov-Wasserstein distances
%A Pinar Demetci
%A Quang Huy Tran
%A Ievgen Redko
%A Ritambhara Singh
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-demetci24a
%I PMLR
%P 298--306
%U https://proceedings.mlr.press/v238/demetci24a.html
%V 238
APA
Demetci, P., Huy Tran, Q., Redko, I. & Singh, R. (2024). Breaking isometric ties and introducing priors in Gromov-Wasserstein distances. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:298-306. Available from https://proceedings.mlr.press/v238/demetci24a.html.
