Wasserstein of Wasserstein Loss for Learning Generative Models

Yonatan Dukler, Wuchen Li, Alex Lin, Guido Montufar
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1716-1725, 2019.

Abstract

The Wasserstein distance serves as a loss function for unsupervised learning which depends on the choice of a ground metric on sample space. We propose to use the Wasserstein distance itself as the ground metric on the sample space of images. This ground metric is known as an effective distance for image retrieval that correlates with human perception. We derive the Wasserstein ground metric on pixel space and define a Riemannian Wasserstein gradient penalty to be used in the Wasserstein Generative Adversarial Network (WGAN) framework. The new gradient penalty is computed efficiently via convolutions on the $L^2$ gradients with negligible additional computational cost. The new formulation is more robust to the natural variability of the data and provides a more continuous discriminator in sample space.
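The abstract's key computational point is that the Riemannian Wasserstein gradient penalty can be evaluated by convolving the discriminator's $L^2$ gradient with a local kernel before taking a norm, instead of using the plain $L^2$ norm as in WGAN-GP. The sketch below is illustrative only and is not the paper's actual operator: the function names, the particular smoothing kernel, and the quadratic-form approximation of the Wasserstein norm are all assumptions made for the example.

```python
import numpy as np

def conv2d(img, kernel):
    # "Same"-size 2D correlation via zero padding (pure NumPy, illustrative).
    # For symmetric kernels, correlation and convolution coincide.
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def wasserstein_gradient_penalty(grad_l2, kernel, lam=10.0):
    """Hedged sketch of the idea: measure the discriminator's L2 gradient
    in an approximate Riemannian (Wasserstein-like) norm by convolving it
    with a local kernel, then apply the familiar WGAN-GP style
    lam * (norm - 1)^2 penalty. The quadratic form <g, K*g> stands in for
    the paper's metric; this is an assumption, not the published method."""
    g = conv2d(grad_l2, kernel)
    norm = np.sqrt(np.sum(grad_l2 * g))  # quadratic form <grad, K * grad>
    return lam * (norm - 1.0) ** 2
```

With the identity kernel (a delta at the center), the quadratic form reduces to the ordinary squared $L^2$ norm, recovering the standard WGAN-GP penalty; replacing it with a smoothing kernel only changes the metric in which the gradient is measured, which is why the extra cost is a single convolution per penalty evaluation.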

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-dukler19a,
  title     = {{W}asserstein of {W}asserstein Loss for Learning Generative Models},
  author    = {Dukler, Yonatan and Li, Wuchen and Lin, Alex and Montufar, Guido},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {1716--1725},
  year      = {2019},
  editor    = {Kamalika Chaudhuri and Ruslan Salakhutdinov},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/dukler19a/dukler19a.pdf},
  url       = {http://proceedings.mlr.press/v97/dukler19a.html},
  abstract  = {The Wasserstein distance serves as a loss function for unsupervised learning which depends on the choice of a ground metric on sample space. We propose to use the Wasserstein distance itself as the ground metric on the sample space of images. This ground metric is known as an effective distance for image retrieval, that correlates with human perception. We derive the Wasserstein ground metric on pixel space and define a Riemannian Wasserstein gradient penalty to be used in the Wasserstein Generative Adversarial Network (WGAN) framework. The new gradient penalty is computed efficiently via convolutions on the $L^2$ gradients with negligible additional computational cost. The new formulation is more robust to the natural variability of the data and provides for a more continuous discriminator in sample space.}
}
Endnote
%0 Conference Paper
%T Wasserstein of Wasserstein Loss for Learning Generative Models
%A Yonatan Dukler
%A Wuchen Li
%A Alex Lin
%A Guido Montufar
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-dukler19a
%I PMLR
%P 1716--1725
%U http://proceedings.mlr.press/v97/dukler19a.html
%V 97
%X The Wasserstein distance serves as a loss function for unsupervised learning which depends on the choice of a ground metric on sample space. We propose to use the Wasserstein distance itself as the ground metric on the sample space of images. This ground metric is known as an effective distance for image retrieval, that correlates with human perception. We derive the Wasserstein ground metric on pixel space and define a Riemannian Wasserstein gradient penalty to be used in the Wasserstein Generative Adversarial Network (WGAN) framework. The new gradient penalty is computed efficiently via convolutions on the $L^2$ gradients with negligible additional computational cost. The new formulation is more robust to the natural variability of the data and provides for a more continuous discriminator in sample space.
APA
Dukler, Y., Li, W., Lin, A. &amp; Montufar, G. (2019). Wasserstein of Wasserstein Loss for Learning Generative Models. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:1716-1725. Available from http://proceedings.mlr.press/v97/dukler19a.html.