Bridging the Gap Between f-GANs and Wasserstein GANs

Jiaming Song, Stefano Ermon
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:9078-9087, 2020.

Abstract

Generative adversarial network (GAN) variants approximately minimize divergences between the model and the data distribution using a discriminator. Wasserstein GANs (WGANs) enjoy superior empirical performance; however, unlike in f-GANs, the discriminator does not provide an estimate of the ratio between model and data densities, which is useful in applications such as inverse reinforcement learning. To overcome this limitation, we propose a new training objective where we additionally optimize over a set of importance weights over the generated samples. By suitably constraining the feasible set of importance weights, we obtain a family of objectives which includes and generalizes the original f-GAN and WGAN objectives. We show that a natural extension outperforms WGANs while providing density ratios as in f-GANs, and demonstrate empirical success on distribution modeling, density ratio estimation, and image generation.
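The importance-weighting idea in the abstract can be illustrated with a minimal sketch. Under a KL-divergence constraint on the feasible set of importance weights, the batch-optimal weights take a softmax form over critic scores on the generated samples; the function name and details below are hypothetical, not the authors' implementation.

```python
import numpy as np

def importance_weights(critic_scores):
    """Self-normalized importance weights over a batch of generated
    samples, assuming the KL-constrained case where the optimal weights
    are proportional to exp(critic score) (i.e., a softmax)."""
    z = critic_scores - critic_scores.max()  # shift for numerical stability
    w = np.exp(z)
    return w / w.sum()

# Samples the critic scores higher receive larger weights in the loss.
scores = np.array([0.5, 1.0, -0.3])
w = importance_weights(scores)
```

In a weighted GAN objective, these weights would multiply the per-sample discriminator terms for the generated batch, interpolating between the unweighted (WGAN-like) and fully reweighted (f-GAN-like) regimes as the constraint tightens.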

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-song20a,
  title = {Bridging the Gap Between f-{GAN}s and {W}asserstein {GAN}s},
  author = {Song, Jiaming and Ermon, Stefano},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages = {9078--9087},
  year = {2020},
  editor = {Hal Daumé III and Aarti Singh},
  volume = {119},
  series = {Proceedings of Machine Learning Research},
  month = {13--18 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v119/song20a/song20a.pdf},
  url = {http://proceedings.mlr.press/v119/song20a.html},
  abstract = {Generative adversarial network (GAN) variants approximately minimize divergences between the model and the data distribution using a discriminator. Wasserstein GANs (WGANs) enjoy superior empirical performance; however, unlike in f-GANs, the discriminator does not provide an estimate of the ratio between model and data densities, which is useful in applications such as inverse reinforcement learning. To overcome this limitation, we propose a new training objective where we additionally optimize over a set of importance weights over the generated samples. By suitably constraining the feasible set of importance weights, we obtain a family of objectives which includes and generalizes the original f-GAN and WGAN objectives. We show that a natural extension outperforms WGANs while providing density ratios as in f-GANs, and demonstrate empirical success on distribution modeling, density ratio estimation, and image generation.}
}
Endnote
%0 Conference Paper
%T Bridging the Gap Between f-GANs and Wasserstein GANs
%A Jiaming Song
%A Stefano Ermon
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-song20a
%I PMLR
%P 9078--9087
%U http://proceedings.mlr.press/v119/song20a.html
%V 119
%X Generative adversarial network (GAN) variants approximately minimize divergences between the model and the data distribution using a discriminator. Wasserstein GANs (WGANs) enjoy superior empirical performance; however, unlike in f-GANs, the discriminator does not provide an estimate of the ratio between model and data densities, which is useful in applications such as inverse reinforcement learning. To overcome this limitation, we propose a new training objective where we additionally optimize over a set of importance weights over the generated samples. By suitably constraining the feasible set of importance weights, we obtain a family of objectives which includes and generalizes the original f-GAN and WGAN objectives. We show that a natural extension outperforms WGANs while providing density ratios as in f-GANs, and demonstrate empirical success on distribution modeling, density ratio estimation, and image generation.
APA
Song, J. & Ermon, S. (2020). Bridging the Gap Between f-GANs and Wasserstein GANs. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:9078-9087. Available from http://proceedings.mlr.press/v119/song20a.html.