Optimizing the Latent Space of Generative Networks

Piotr Bojanowski, Armand Joulin, David Lopez-Paz, Arthur Szlam
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:600-609, 2018.

Abstract

Generative Adversarial Networks (GANs) have achieved remarkable results in the task of generating realistic natural images. In most successful applications, GAN models share two common aspects: solving a challenging saddle point optimization problem, interpreted as an adversarial game between generator and discriminator functions; and parameterizing the generator and the discriminator as deep convolutional neural networks. The goal of this paper is to disentangle the contribution of these two factors to the success of GANs. In particular, we introduce Generative Latent Optimization (GLO), a framework to train deep convolutional generators using simple reconstruction losses. Through a variety of experiments, we show that GLO enjoys many of the desirable properties of GANs: synthesizing visually appealing samples, interpolating meaningfully between samples, and performing linear arithmetic with noise vectors; all of this without the adversarial optimization scheme.
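The core idea of GLO can be illustrated with a toy sketch: instead of a discriminator, each training sample gets its own learnable latent code, and the codes are optimized jointly with the generator parameters under a plain reconstruction loss. The example below is a deliberately minimal one-dimensional illustration (a linear "generator" g(z) = w·z over scalar data); the actual paper uses deep convolutional generators, projects latent codes onto the unit sphere, and uses a Laplacian pyramid loss, all of which this sketch omits.

```python
# Toy sketch of Generative Latent Optimization (GLO), not the paper's setup:
# a 1-D linear "generator" g(z) = w * z, with one learnable latent code per
# training sample, trained jointly with w by gradient descent on a squared
# reconstruction loss -- no discriminator, no adversarial game.

import random

random.seed(0)

data = [2.0, 4.0, -6.0, 8.0]               # toy "dataset" of scalars
w = random.uniform(-1, 1)                   # generator parameter
z = [random.uniform(-1, 1) for _ in data]   # one latent code per sample
lr = 0.02

for step in range(5000):
    for i, x in enumerate(data):
        err = w * z[i] - x      # derivative of 0.5 * (w*z_i - x)^2 w.r.t. output
        # Joint gradient step on the latent code AND the generator weight.
        grad_z = err * w
        grad_w = err * z[i]
        z[i] -= lr * grad_z
        w -= lr * grad_w

loss = sum((w * zi - xi) ** 2 for zi, xi in zip(z, data)) / len(data)
print(loss)
```

After training, each code z_i has moved so that the generator reconstructs its sample, which is the joint optimization the paper studies; in the full method, new samples are then drawn by fitting a simple distribution to the learned codes and passing draws through the generator.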

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-bojanowski18a,
  title = {Optimizing the Latent Space of Generative Networks},
  author = {Bojanowski, Piotr and Joulin, Armand and Lopez-Paz, David and Szlam, Arthur},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages = {600--609},
  year = {2018},
  editor = {Dy, Jennifer and Krause, Andreas},
  volume = {80},
  series = {Proceedings of Machine Learning Research},
  month = {10--15 Jul},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v80/bojanowski18a/bojanowski18a.pdf},
  url = {https://proceedings.mlr.press/v80/bojanowski18a.html},
  abstract = {Generative Adversarial Networks (GANs) have achieved remarkable results in the task of generating realistic natural images. In most successful applications, GAN models share two common aspects: solving a challenging saddle point optimization problem, interpreted as an adversarial game between generator and discriminator functions; and parameterizing the generator and the discriminator as deep convolutional neural networks. The goal of this paper is to disentangle the contribution of these two factors to the success of GANs. In particular, we introduce Generative Latent Optimization (GLO), a framework to train deep convolutional generators using simple reconstruction losses. Through a variety of experiments, we show that GLO enjoys many of the desirable properties of GANs: synthesizing visually appealing samples, interpolating meaningfully between samples, and performing linear arithmetic with noise vectors; all of this without the adversarial optimization scheme.}
}
Endnote
%0 Conference Paper %T Optimizing the Latent Space of Generative Networks %A Piotr Bojanowski %A Armand Joulin %A David Lopez-Paz %A Arthur Szlam %B Proceedings of the 35th International Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2018 %E Jennifer Dy %E Andreas Krause %F pmlr-v80-bojanowski18a %I PMLR %P 600--609 %U https://proceedings.mlr.press/v80/bojanowski18a.html %V 80 %X Generative Adversarial Networks (GANs) have achieved remarkable results in the task of generating realistic natural images. In most successful applications, GAN models share two common aspects: solving a challenging saddle point optimization problem, interpreted as an adversarial game between generator and discriminator functions; and parameterizing the generator and the discriminator as deep convolutional neural networks. The goal of this paper is to disentangle the contribution of these two factors to the success of GANs. In particular, we introduce Generative Latent Optimization (GLO), a framework to train deep convolutional generators using simple reconstruction losses. Through a variety of experiments, we show that GLO enjoys many of the desirable properties of GANs: synthesizing visually appealing samples, interpolating meaningfully between samples, and performing linear arithmetic with noise vectors; all of this without the adversarial optimization scheme.
APA
Bojanowski, P., Joulin, A., Lopez-Paz, D. & Szlam, A. (2018). Optimizing the Latent Space of Generative Networks. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:600-609. Available from https://proceedings.mlr.press/v80/bojanowski18a.html.