Regularized Optimal Transport is Ground Cost Adversarial

François-Pierre Paty, Marco Cuturi
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:7532-7542, 2020.

Abstract

Regularizing the optimal transport (OT) problem has proven crucial for OT theory to impact the field of machine learning. For instance, it is known that regularizing OT problems with entropy leads to faster computations and better differentiation using the Sinkhorn algorithm, as well as better sample complexity bounds than classic OT. In this work we depart from this practical perspective and propose a new interpretation of regularization as a robust mechanism, and show using Fenchel duality that any convex regularization of OT can be interpreted as ground cost adversarial. This incidentally gives access to a robust dissimilarity measure on the ground space, which can in turn be used in other applications. We propose algorithms to compute this robust cost, and illustrate the interest of this approach empirically.
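The ground-cost-adversarial reading alluded to above follows from standard Fenchel duality. The sketch below uses generic notation (a convex regularizer R with conjugate R*, the transport polytope Π(μ,ν), a perturbation Q of the ground cost C), not the paper's exact statement or symbols:

\[
\begin{aligned}
\min_{\pi \in \Pi(\mu,\nu)} \langle C, \pi \rangle + R(\pi)
  &= \min_{\pi \in \Pi(\mu,\nu)} \sup_{Q} \; \langle C + Q, \pi \rangle - R^{*}(Q)
     && \text{(since $R = R^{**}$ for convex l.s.c.\ $R$)} \\
  &= \sup_{Q} \; \min_{\pi \in \Pi(\mu,\nu)} \langle C + Q, \pi \rangle - R^{*}(Q)
     && \text{(minimax swap; $\Pi(\mu,\nu)$ is convex and compact)} \\
  &= \sup_{Q} \; \mathrm{OT}_{C+Q}(\mu,\nu) - R^{*}(Q).
\end{aligned}
\]

In words: the regularized OT value equals the best unregularized OT value an adversary can achieve by perturbing the ground cost from C to C + Q, at a price R*(Q). The robust dissimilarity measure mentioned in the abstract plausibly corresponds to the adversarially perturbed cost attaining this supremum (again, the notation here is ours, not the paper's).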

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-paty20a,
  title     = {Regularized Optimal Transport is Ground Cost Adversarial},
  author    = {Paty, Fran{\c{c}}ois-Pierre and Cuturi, Marco},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {7532--7542},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/paty20a/paty20a.pdf},
  url       = {https://proceedings.mlr.press/v119/paty20a.html},
  abstract  = {Regularizing the optimal transport (OT) problem has proven crucial for OT theory to impact the field of machine learning. For instance, it is known that regularizing OT problems with entropy leads to faster computations and better differentiation using the Sinkhorn algorithm, as well as better sample complexity bounds than classic OT. In this work we depart from this practical perspective and propose a new interpretation of regularization as a robust mechanism, and show using Fenchel duality that any convex regularization of OT can be interpreted as ground cost adversarial. This incidentally gives access to a robust dissimilarity measure on the ground space, which can in turn be used in other applications. We propose algorithms to compute this robust cost, and illustrate the interest of this approach empirically.}
}
Endnote
%0 Conference Paper
%T Regularized Optimal Transport is Ground Cost Adversarial
%A François-Pierre Paty
%A Marco Cuturi
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-paty20a
%I PMLR
%P 7532--7542
%U https://proceedings.mlr.press/v119/paty20a.html
%V 119
%X Regularizing the optimal transport (OT) problem has proven crucial for OT theory to impact the field of machine learning. For instance, it is known that regularizing OT problems with entropy leads to faster computations and better differentiation using the Sinkhorn algorithm, as well as better sample complexity bounds than classic OT. In this work we depart from this practical perspective and propose a new interpretation of regularization as a robust mechanism, and show using Fenchel duality that any convex regularization of OT can be interpreted as ground cost adversarial. This incidentally gives access to a robust dissimilarity measure on the ground space, which can in turn be used in other applications. We propose algorithms to compute this robust cost, and illustrate the interest of this approach empirically.
APA
Paty, F. & Cuturi, M. (2020). Regularized Optimal Transport is Ground Cost Adversarial. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:7532-7542. Available from https://proceedings.mlr.press/v119/paty20a.html.
