On the Limitations of First-Order Approximation in GAN Dynamics

Jerry Li, Aleksander Madry, John Peebles, Ludwig Schmidt
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3005-3013, 2018.

Abstract

While Generative Adversarial Networks (GANs) have demonstrated promising performance on multiple vision tasks, their learning dynamics are not yet well understood, both in theory and in practice. To address this issue, we study GAN dynamics in a simple yet rich parametric model that exhibits several of the common problematic convergence behaviors such as vanishing gradients, mode collapse, and diverging or oscillatory behavior. In spite of the non-convex nature of our model, we are able to perform a rigorous theoretical analysis of its convergence behavior. Our analysis reveals an interesting dichotomy: a GAN with an optimal discriminator provably converges, while first-order approximations of the discriminator steps lead to unstable GAN dynamics and mode collapse. Our result suggests that using first-order discriminator steps (the de facto standard in most existing GAN setups) might be one of the factors that makes GAN training challenging in practice.
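
To make the dichotomy concrete, here is a minimal sketch of the two kinds of dynamics. It is a hypothetical toy problem (a one-dimensional linear "Dirac-GAN"-style game), not the parametric mixture-of-Gaussians model analyzed in the paper: with a best-response (optimal) discriminator the generator parameter contracts toward the target, while simultaneous first-order updates oscillate with growing amplitude.

import numpy as np

# Hypothetical toy illustration (not the paper's model): the generator
# outputs the point mass at `theta`, the true data sit at 0, and the
# discriminator is linear, D(x) = psi * x with psi restricted to [-1, 1].
# The generator's goal is theta -> 0.

eta, steps = 0.1, 200

# (a) Optimal discriminator: psi is re-solved exactly (best response
#     sign(theta)) before every generator step, so |theta| shrinks until
#     it stays within one step size of the target.
theta = 1.0
for _ in range(steps):
    psi = np.sign(theta)   # exact best response under |psi| <= 1
    theta -= eta * psi     # generator gradient step
print(f"optimal discriminator: theta = {theta:+.4f}")

# (b) First-order discriminator: both players take one simultaneous
#     gradient step. On this bilinear game each update is a rotation
#     scaled by sqrt(1 + eta^2) > 1, so (theta, psi) spirals outward:
#     oscillation with growing amplitude instead of convergence.
theta, psi = 1.0, 0.0
for _ in range(steps):
    theta, psi = theta - eta * psi, psi + eta * theta
print(f"first-order dynamics:  theta = {theta:+.4f}")

The paper establishes this contrast rigorously in its own model: exact discriminator updates yield provable convergence, while first-order discriminator steps provably produce instability and mode collapse.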

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-li18d,
  title     = {On the Limitations of First-Order Approximation in {GAN} Dynamics},
  author    = {Li, Jerry and Madry, Aleksander and Peebles, John and Schmidt, Ludwig},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3005--3013},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/li18d/li18d.pdf},
  url       = {https://proceedings.mlr.press/v80/li18d.html},
  abstract  = {While Generative Adversarial Networks (GANs) have demonstrated promising performance on multiple vision tasks, their learning dynamics are not yet well understood, both in theory and in practice. To address this issue, we study GAN dynamics in a simple yet rich parametric model that exhibits several of the common problematic convergence behaviors such as vanishing gradients, mode collapse, and diverging or oscillatory behavior. In spite of the non-convex nature of our model, we are able to perform a rigorous theoretical analysis of its convergence behavior. Our analysis reveals an interesting dichotomy: a GAN with an optimal discriminator provably converges, while first order approximations of the discriminator steps lead to unstable GAN dynamics and mode collapse. Our result suggests that using first order discriminator steps (the de-facto standard in most existing GAN setups) might be one of the factors that makes GAN training challenging in practice.}
}
Endnote
%0 Conference Paper
%T On the Limitations of First-Order Approximation in GAN Dynamics
%A Jerry Li
%A Aleksander Madry
%A John Peebles
%A Ludwig Schmidt
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-li18d
%I PMLR
%P 3005--3013
%U https://proceedings.mlr.press/v80/li18d.html
%V 80
%X While Generative Adversarial Networks (GANs) have demonstrated promising performance on multiple vision tasks, their learning dynamics are not yet well understood, both in theory and in practice. To address this issue, we study GAN dynamics in a simple yet rich parametric model that exhibits several of the common problematic convergence behaviors such as vanishing gradients, mode collapse, and diverging or oscillatory behavior. In spite of the non-convex nature of our model, we are able to perform a rigorous theoretical analysis of its convergence behavior. Our analysis reveals an interesting dichotomy: a GAN with an optimal discriminator provably converges, while first order approximations of the discriminator steps lead to unstable GAN dynamics and mode collapse. Our result suggests that using first order discriminator steps (the de-facto standard in most existing GAN setups) might be one of the factors that makes GAN training challenging in practice.
APA
Li, J., Madry, A., Peebles, J. & Schmidt, L. (2018). On the Limitations of First-Order Approximation in GAN Dynamics. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3005-3013. Available from https://proceedings.mlr.press/v80/li18d.html.