Entropic Gromov-Wasserstein between Gaussian Distributions

Khang Le, Dung Q Le, Huy Nguyen, Dat Do, Tung Pham, Nhat Ho
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:12164-12203, 2022.

Abstract

We study the entropic Gromov-Wasserstein and its unbalanced version between (unbalanced) Gaussian distributions with different dimensions. When the metric is the inner product, which we refer to as inner product Gromov-Wasserstein (IGW), we demonstrate that the optimal transportation plans of entropic IGW and its unbalanced variant are (unbalanced) Gaussian distributions. Via an application of von Neumann’s trace inequality, we obtain closed-form expressions for the entropic IGW between these Gaussian distributions. Finally, we consider an entropic inner product Gromov-Wasserstein barycenter of multiple Gaussian distributions. We prove that the barycenter is a Gaussian distribution when the entropic regularization parameter is small. We further derive a closed-form expression for the covariance matrix of the barycenter.
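Below is a minimal numerical sketch of the quantity studied above, assuming the POT library (imported as ot) and its generic entropic Gromov-Wasserstein solver ot.gromov.entropic_gromov_wasserstein. It estimates the entropic inner product Gromov-Wasserstein (IGW) cost between two Gaussians of different dimensions from samples, using Gram (inner product) matrices as the cost matrices. The dimensions, covariances, sample size, and regularization strength are illustrative choices, and this Monte Carlo estimate is a sanity check, not the paper's closed-form expression.

# Sketch: empirical entropic IGW between two Gaussians of different dimensions,
# assuming the POT library ("pot" on PyPI). Gram matrices as cost matrices
# correspond to taking the inner product as the metric on each space.
import numpy as np
import ot

rng = np.random.default_rng(0)
n = 200                                    # samples drawn from each Gaussian

# Zero-mean Gaussians in dimensions 3 and 5 with illustrative diagonal covariances.
X = rng.multivariate_normal(np.zeros(3), np.diag([1.0, 0.5, 0.25]), size=n)
Y = rng.multivariate_normal(np.zeros(5), np.diag([1.0, 0.8, 0.6, 0.4, 0.2]), size=n)

C1 = X @ X.T                               # Gram matrix of inner products <x_i, x_k>
C2 = Y @ Y.T                               # Gram matrix of inner products <y_j, y_l>
p = np.full(n, 1.0 / n)                    # uniform weights on the empirical supports
q = np.full(n, 1.0 / n)

# Entropic GW coupling with squared loss on the Gram matrices
# (epsilon is an arbitrary illustrative regularization strength).
T = ot.gromov.entropic_gromov_wasserstein(
    C1, C2, p, q, loss_fun="square_loss", epsilon=1.0
)

# Evaluate the (unregularized) IGW objective at the entropic plan:
#   sum_{i,j,k,l} (<x_i, x_k> - <y_j, y_l>)^2 T_{ij} T_{kl}
cost = (
    np.einsum("ik,i,k->", C1 ** 2, p, p)
    + np.einsum("jl,j,l->", C2 ** 2, q, q)
    - 2.0 * np.sum(T * (C1 @ T @ C2.T))
)
print("empirical entropic IGW cost:", cost)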

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-le22a,
  title = {Entropic Gromov-{W}asserstein between {G}aussian Distributions},
  author = {Le, Khang and Le, Dung Q and Nguyen, Huy and Do, Dat and Pham, Tung and Ho, Nhat},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages = {12164--12203},
  year = {2022},
  editor = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume = {162},
  series = {Proceedings of Machine Learning Research},
  month = {17--23 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v162/le22a/le22a.pdf},
  url = {https://proceedings.mlr.press/v162/le22a.html},
  abstract = {We study the entropic Gromov-Wasserstein and its unbalanced version between (unbalanced) Gaussian distributions with different dimensions. When the metric is the inner product, which we refer to as inner product Gromov-Wasserstein (IGW), we demonstrate that the optimal transportation plans of entropic IGW and its unbalanced variant are (unbalanced) Gaussian distributions. Via an application of von Neumann’s trace inequality, we obtain closed-form expressions for the entropic IGW between these Gaussian distributions. Finally, we consider an entropic inner product Gromov-Wasserstein barycenter of multiple Gaussian distributions. We prove that the barycenter is a Gaussian distribution when the entropic regularization parameter is small. We further derive a closed-form expression for the covariance matrix of the barycenter.}
}
Endnote
%0 Conference Paper
%T Entropic Gromov-Wasserstein between Gaussian Distributions
%A Khang Le
%A Dung Q Le
%A Huy Nguyen
%A Dat Do
%A Tung Pham
%A Nhat Ho
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-le22a
%I PMLR
%P 12164--12203
%U https://proceedings.mlr.press/v162/le22a.html
%V 162
%X We study the entropic Gromov-Wasserstein and its unbalanced version between (unbalanced) Gaussian distributions with different dimensions. When the metric is the inner product, which we refer to as inner product Gromov-Wasserstein (IGW), we demonstrate that the optimal transportation plans of entropic IGW and its unbalanced variant are (unbalanced) Gaussian distributions. Via an application of von Neumann’s trace inequality, we obtain closed-form expressions for the entropic IGW between these Gaussian distributions. Finally, we consider an entropic inner product Gromov-Wasserstein barycenter of multiple Gaussian distributions. We prove that the barycenter is a Gaussian distribution when the entropic regularization parameter is small. We further derive a closed-form expression for the covariance matrix of the barycenter.
APA
Le, K., Le, D.Q., Nguyen, H., Do, D., Pham, T. & Ho, N. (2022). Entropic Gromov-Wasserstein between Gaussian Distributions. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:12164-12203. Available from https://proceedings.mlr.press/v162/le22a.html.
