Is Generator Conditioning Causally Related to GAN Performance?

Augustus Odena, Jacob Buckman, Catherine Olsson, Tom Brown, Christopher Olah, Colin Raffel, Ian Goodfellow
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3849-3858, 2018.

Abstract

Recent work suggests that controlling the entire distribution of Jacobian singular values is an important design consideration in deep learning. Motivated by this, we study the distribution of singular values of the Jacobian of the generator in Generative Adversarial Networks. We find that this Jacobian generally becomes ill-conditioned at the beginning of training. Moreover, we find that the average (across the latent space) conditioning of the generator is highly predictive of two other ad-hoc metrics for measuring the “quality” of trained GANs: the Inception Score and the Fréchet Inception Distance. We then test the hypothesis that this relationship is causal by proposing a “regularization” technique (called Jacobian Clamping) that softly penalizes the condition number of the generator Jacobian. Jacobian Clamping improves the mean score for nearly all datasets on which we tested it. It also greatly reduces inter-run variance of the aforementioned scores, addressing (at least partially) one of the main criticisms of GANs.
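The abstract describes Jacobian Clamping only at a high level: a soft penalty that keeps the generator Jacobian well-conditioned. As a rough illustration of that idea, the sketch below estimates the generator's sensitivity along a random latent direction by finite differences and penalizes it when it leaves a band [λ_min, λ_max]. This is an assumption-laden sketch, not a verbatim reproduction of the paper's algorithm; the function name and the hyperparameter values (eps, lambda_min, lambda_max) are illustrative.

```python
import torch

def jacobian_clamping_penalty(G, z, eps=1.0, lambda_min=1.0, lambda_max=20.0):
    """Soft penalty encouraging the generator's finite-difference
    Jacobian norm along a random direction to stay in [lambda_min, lambda_max].

    G: generator mapping latents of shape (batch, dim_z) to samples.
    eps, lambda_min, lambda_max: illustrative values, not from the paper's text.
    """
    # Random perturbation direction, rescaled to length eps per example.
    delta = torch.randn_like(z)
    delta = eps * delta / delta.norm(dim=1, keepdim=True)

    gz = G(z).flatten(1)
    gz_prime = G(z + delta).flatten(1)

    # Finite-difference estimate of ||J(z) v|| / ||v|| for direction v = delta.
    q = (gz_prime - gz).norm(dim=1) / delta.norm(dim=1)

    # Zero inside the band; quadratic penalty once q exceeds lambda_max
    # or falls below lambda_min.
    l_max = (torch.clamp(q, min=lambda_max) - lambda_max) ** 2
    l_min = (torch.clamp(q, max=lambda_min) - lambda_min) ** 2
    return (l_max + l_min).mean()
```

In a typical training loop such a term would simply be added to the generator's loss, e.g. `loss_G = gan_loss + jacobian_clamping_penalty(G, z)`, leaving the discriminator update unchanged.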

Cite this Paper

BibTeX
@InProceedings{pmlr-v80-odena18a,
  title     = {Is Generator Conditioning Causally Related to {GAN} Performance?},
  author    = {Odena, Augustus and Buckman, Jacob and Olsson, Catherine and Brown, Tom and Olah, Christopher and Raffel, Colin and Goodfellow, Ian},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3849--3858},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/odena18a/odena18a.pdf},
  url       = {https://proceedings.mlr.press/v80/odena18a.html}
}
Endnote
%0 Conference Paper
%T Is Generator Conditioning Causally Related to GAN Performance?
%A Augustus Odena
%A Jacob Buckman
%A Catherine Olsson
%A Tom Brown
%A Christopher Olah
%A Colin Raffel
%A Ian Goodfellow
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-odena18a
%I PMLR
%P 3849--3858
%U https://proceedings.mlr.press/v80/odena18a.html
%V 80
APA
Odena, A., Buckman, J., Olsson, C., Brown, T., Olah, C., Raffel, C. & Goodfellow, I. (2018). Is Generator Conditioning Causally Related to GAN Performance?. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3849-3858. Available from https://proceedings.mlr.press/v80/odena18a.html.