Metropolis-Hastings Generative Adversarial Networks

Ryan Turner, Jane Hung, Eric Frank, Yunus Saatchi, Jason Yosinski
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6345-6353, 2019.

Abstract

We introduce the Metropolis-Hastings generative adversarial network (MH-GAN), which combines aspects of Markov chain Monte Carlo and GANs. The MH-GAN draws samples from the distribution implicitly defined by a GAN’s discriminator-generator pair, as opposed to standard GANs which draw samples from the distribution defined only by the generator. It uses the discriminator from GAN training to build a wrapper around the generator for improved sampling. With a perfect discriminator, this wrapped generator samples from the true distribution on the data exactly even when the generator is imperfect. We demonstrate the benefits of the improved generator on multiple benchmark datasets, including CIFAR-10 and CelebA, using the DCGAN, WGAN, and progressive GAN.
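The discriminator wrapper the abstract describes can be sketched as an independence Metropolis-Hastings sampler: proposals come from the generator, and a calibrated discriminator D(x) ≈ p_data(x) / (p_data(x) + p_g(x)) supplies the density ratio needed for the acceptance rule min(1, (1/D(x) − 1) / (1/D(x′) − 1)). The toy 1-D generator and "perfect" discriminator below are illustrative stand-ins, not the paper's models:

```python
import numpy as np

def mh_gan_sample(generator, discriminator, n_steps=50, rng=None):
    """One MH-GAN chain: proposals are fresh generator draws, accepted
    with probability min(1, (1/D(x) - 1) / (1/D(x')  - 1)), where D(x)
    estimates p_data(x) / (p_data(x) + p_g(x))."""
    rng = np.random.default_rng() if rng is None else rng
    x = generator(rng)          # initialize the chain from the generator
    d_x = discriminator(x)
    for _ in range(n_steps):
        x_prop = generator(rng)            # independence proposal from G
        d_prop = discriminator(x_prop)
        # Density ratio p_data/p_g is D/(1-D); for an independence
        # proposal the MH acceptance probability simplifies as below.
        alpha = min(1.0, (1.0 / d_x - 1.0) / (1.0 / d_prop - 1.0))
        if rng.uniform() < alpha:
            x, d_x = x_prop, d_prop
    return x

# Toy check (hypothetical setup): the generator is N(0,1) but the data
# distribution is N(1,1). With a perfect discriminator, the wrapped
# sampler should recover the data distribution despite the imperfect G.
def gen(rng):
    return rng.normal(0.0, 1.0)

def disc(x):
    p_d = np.exp(-0.5 * (x - 1.0) ** 2)   # unnormalized N(1,1) density
    p_g = np.exp(-0.5 * x ** 2)           # unnormalized N(0,1) density
    return p_d / (p_d + p_g)

rng = np.random.default_rng(0)
samples = np.array([mh_gan_sample(gen, disc, rng=rng) for _ in range(2000)])
```

In this sketch each call runs an independent chain; the sample means shift from the generator's 0 toward the data mean of 1, illustrating the abstract's claim that a perfect discriminator yields exact sampling from the data distribution even when the generator is imperfect.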

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-turner19a,
  title     = {{M}etropolis-{H}astings Generative Adversarial Networks},
  author    = {Turner, Ryan and Hung, Jane and Frank, Eric and Saatchi, Yunus and Yosinski, Jason},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6345--6353},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/turner19a/turner19a.pdf},
  url       = {https://proceedings.mlr.press/v97/turner19a.html},
  abstract  = {We introduce the Metropolis-Hastings generative adversarial network (MH-GAN), which combines aspects of Markov chain Monte Carlo and GANs. The MH-GAN draws samples from the distribution implicitly defined by a GAN’s discriminator-generator pair, as opposed to standard GANs which draw samples from the distribution defined only by the generator. It uses the discriminator from GAN training to build a wrapper around the generator for improved sampling. With a perfect discriminator, this wrapped generator samples from the true distribution on the data exactly even when the generator is imperfect. We demonstrate the benefits of the improved generator on multiple benchmark datasets, including CIFAR-10 and CelebA, using the DCGAN, WGAN, and progressive GAN.}
}
Endnote
%0 Conference Paper
%T Metropolis-Hastings Generative Adversarial Networks
%A Ryan Turner
%A Jane Hung
%A Eric Frank
%A Yunus Saatchi
%A Jason Yosinski
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-turner19a
%I PMLR
%P 6345--6353
%U https://proceedings.mlr.press/v97/turner19a.html
%V 97
%X We introduce the Metropolis-Hastings generative adversarial network (MH-GAN), which combines aspects of Markov chain Monte Carlo and GANs. The MH-GAN draws samples from the distribution implicitly defined by a GAN’s discriminator-generator pair, as opposed to standard GANs which draw samples from the distribution defined only by the generator. It uses the discriminator from GAN training to build a wrapper around the generator for improved sampling. With a perfect discriminator, this wrapped generator samples from the true distribution on the data exactly even when the generator is imperfect. We demonstrate the benefits of the improved generator on multiple benchmark datasets, including CIFAR-10 and CelebA, using the DCGAN, WGAN, and progressive GAN.
APA
Turner, R., Hung, J., Frank, E., Saatchi, Y. & Yosinski, J. (2019). Metropolis-Hastings Generative Adversarial Networks. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6345-6353. Available from https://proceedings.mlr.press/v97/turner19a.html.