Generalization and Equilibrium in Generative Adversarial Nets (GANs)

Sanjeev Arora, Rong Ge, Yingyu Liang, Tengyu Ma, Yi Zhang
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:224-232, 2017.

Abstract

It is shown that training of generative adversarial networks (GANs) may not have good generalization properties; e.g., training may appear successful, but the trained distribution may be far from the target distribution in standard metrics. However, generalization does occur for a weaker metric called neural net distance. It is also shown that an approximate pure equilibrium exists in the discriminator/generator game for a natural training objective (Wasserstein) when the generator capacity and training set size are moderate. This existence of equilibrium inspires the MIX+GAN protocol, which can be combined with any existing GAN training method and is shown empirically to improve some of them.
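For reference, the neural net distance invoked in the generalization result is an integral probability metric taken over the class of discriminator networks. A minimal sketch of the definition follows; the symbol \mathcal{F} for the discriminator class is our notation, not necessarily the paper's:

    % Neural net distance between distributions \mu and \nu, where
    % \mathcal{F} is a class of functions computable by discriminator nets:
    d_{\mathcal{F}}(\mu, \nu) = \sup_{D \in \mathcal{F}}
        \left| \mathbb{E}_{x \sim \mu}[D(x)] - \mathbb{E}_{x \sim \nu}[D(x)] \right|

Roughly, generalization in this metric means that the distance measured on finite samples tracks the distance between the underlying distributions once the sample size grows polynomially in the capacity of the discriminator class, a much weaker requirement than closeness in Jensen-Shannon divergence or Wasserstein distance.

The MIX+GAN protocol trains small mixtures of generators and discriminators with learned mixture weights. The Python sketch below is an illustrative reconstruction of such a mixed objective, not the authors' code; the network architectures, the mixture size T, and the entropy coefficient are all assumptions:

    # Hypothetical sketch of a MIX+GAN-style mixed objective: a small
    # mixture of generators and discriminators with trainable weights.
    import torch
    import torch.nn as nn

    T = 3  # mixture size (assumption; small constants suffice in practice)

    class Generator(nn.Module):
        def __init__(self, z_dim=64, x_dim=784):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(),
                                     nn.Linear(128, x_dim))
        def forward(self, z):
            return self.net(z)

    class Discriminator(nn.Module):
        def __init__(self, x_dim=784):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(x_dim, 128), nn.ReLU(),
                                     nn.Linear(128, 1))
        def forward(self, x):
            return torch.sigmoid(self.net(x))

    gens = nn.ModuleList([Generator() for _ in range(T)])
    discs = nn.ModuleList([Discriminator() for _ in range(T)])
    gen_logits = torch.zeros(T, requires_grad=True)   # generator mixture weights
    disc_logits = torch.zeros(T, requires_grad=True)  # discriminator mixture weights

    def mixed_gan_objective(real, z, entropy_coef=1e-3):
        """Weighted sum of pairwise GAN payoffs plus an entropy bonus
        that keeps the mixture weights from collapsing onto one component."""
        alpha = torch.softmax(gen_logits, dim=0)
        beta = torch.softmax(disc_logits, dim=0)
        payoff = 0.0
        for i, G in enumerate(gens):
            fake = G(z)
            for j, D in enumerate(discs):
                pair = (torch.log(D(real) + 1e-8).mean()
                        + torch.log(1 - D(fake) + 1e-8).mean())
                payoff = payoff + alpha[i] * beta[j] * pair
        entropy = -(alpha * alpha.log()).sum() - (beta * beta.log()).sum()
        return payoff + entropy_coef * entropy

The discriminator side ascends this objective in the discriminator parameters and disc_logits, while the generator side descends it in the generator parameters and gen_logits, mirroring the game-theoretic intuition that mixed strategies make approximate equilibria easier to attain.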

Cite this Paper

BibTeX
@InProceedings{pmlr-v70-arora17a,
  title     = {Generalization and Equilibrium in Generative Adversarial Nets ({GAN}s)},
  author    = {Sanjeev Arora and Rong Ge and Yingyu Liang and Tengyu Ma and Yi Zhang},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {224--232},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/arora17a/arora17a.pdf},
  url       = {https://proceedings.mlr.press/v70/arora17a.html},
  abstract  = {It is shown that training of generative adversarial network (GAN) may not have good generalization properties; e.g., training may appear successful but the trained distribution may be far from target distribution in standard metrics. However, generalization does occur for a weaker metric called neural net distance. It is also shown that an approximate pure equilibrium exists in the discriminator/generator game for a natural training objective (Wasserstein) when generator capacity and training set sizes are moderate. This existence of equilibrium inspires MIX+GAN protocol, which can be combined with any existing GAN training, and empirically shown to improve some of them.}
}
Endnote
%0 Conference Paper
%T Generalization and Equilibrium in Generative Adversarial Nets (GANs)
%A Sanjeev Arora
%A Rong Ge
%A Yingyu Liang
%A Tengyu Ma
%A Yi Zhang
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-arora17a
%I PMLR
%P 224--232
%U https://proceedings.mlr.press/v70/arora17a.html
%V 70
%X It is shown that training of generative adversarial network (GAN) may not have good generalization properties; e.g., training may appear successful but the trained distribution may be far from target distribution in standard metrics. However, generalization does occur for a weaker metric called neural net distance. It is also shown that an approximate pure equilibrium exists in the discriminator/generator game for a natural training objective (Wasserstein) when generator capacity and training set sizes are moderate. This existence of equilibrium inspires MIX+GAN protocol, which can be combined with any existing GAN training, and empirically shown to improve some of them.
APA
Arora, S., Ge, R., Liang, Y., Ma, T. & Zhang, Y. (2017). Generalization and Equilibrium in Generative Adversarial Nets (GANs). Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:224-232. Available from https://proceedings.mlr.press/v70/arora17a.html.
