Wasserstein Generative Adversarial Networks

Martin Arjovsky, Soumith Chintala, Léon Bottou
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:214-223, 2017.

Abstract

We introduce a new algorithm named WGAN, an alternative to traditional GAN training. In this new model, we show that we can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, we show that the corresponding optimization problem is sound, and provide extensive theoretical work highlighting the deep connections to different distances between distributions.
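The distance underlying WGAN, the Wasserstein-1 (earth mover's) distance, has an exact closed form in one dimension: sort both samples and average the absolute differences of the matched order statistics. The sketch below is an illustrative toy for equal-size 1-D samples only, not the paper's training algorithm (which estimates the distance via a Lipschitz-constrained critic network); the function name is ours.

```python
def w1_distance_1d(a, b):
    """Exact Wasserstein-1 distance between two equal-size 1-D samples.

    In one dimension the optimal transport plan matches the i-th smallest
    point of one sample to the i-th smallest point of the other, so the
    distance is the mean absolute difference of the sorted samples.
    """
    assert len(a) == len(b), "this closed form assumes equal sample sizes"
    return sum(abs(x - y) for x, y in zip(sorted(a), sorted(b))) / len(a)

# Identical samples are at distance 0; shifting every point by a constant
# c moves the distance to exactly c.
print(w1_distance_1d([0.0, 1.0, 2.0], [2.0, 0.0, 1.0]))   # 0.0
print(w1_distance_1d([0.0, 1.0, 2.0], [2.5, 3.5, 4.5]))   # 2.5
```

Unlike the Jensen-Shannon divergence implicit in standard GAN training, this distance varies smoothly as the two samples move apart, which is the property the paper exploits to obtain meaningful learning curves.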

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-arjovsky17a,
  title     = {{W}asserstein Generative Adversarial Networks},
  author    = {Martin Arjovsky and Soumith Chintala and L{\'e}on Bottou},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {214--223},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/arjovsky17a/arjovsky17a.pdf},
  url       = {https://proceedings.mlr.press/v70/arjovsky17a.html},
  abstract  = {We introduce a new algorithm named WGAN, an alternative to traditional GAN training. In this new model, we show that we can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, we show that the corresponding optimization problem is sound, and provide extensive theoretical work highlighting the deep connections to different distances between distributions.}
}
Endnote
%0 Conference Paper
%T Wasserstein Generative Adversarial Networks
%A Martin Arjovsky
%A Soumith Chintala
%A Léon Bottou
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-arjovsky17a
%I PMLR
%P 214--223
%U https://proceedings.mlr.press/v70/arjovsky17a.html
%V 70
%X We introduce a new algorithm named WGAN, an alternative to traditional GAN training. In this new model, we show that we can improve the stability of learning, get rid of problems like mode collapse, and provide meaningful learning curves useful for debugging and hyperparameter searches. Furthermore, we show that the corresponding optimization problem is sound, and provide extensive theoretical work highlighting the deep connections to different distances between distributions.
APA
Arjovsky, M., Chintala, S., & Bottou, L. (2017). Wasserstein generative adversarial networks. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:214-223. Available from https://proceedings.mlr.press/v70/arjovsky17a.html.