Functional Space Analysis of Local GAN Convergence

Valentin Khrulkov, Artem Babenko, Ivan Oseledets
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:5432-5442, 2021.

Abstract

Recent work demonstrated the benefits of studying continuous-time dynamics governing the GAN training. However, this dynamics is analyzed in the model parameter space, which results in finite-dimensional dynamical systems. We propose a novel perspective where we study the local dynamics of adversarial training in the general functional space and show how it can be represented as a system of partial differential equations. Thus, the convergence properties can be inferred from the eigenvalues of the resulting differential operator. We show that these eigenvalues can be efficiently estimated from the target dataset before training. Our perspective reveals several insights on the practical tricks commonly used to stabilize GANs, such as gradient penalty, data augmentation, and advanced integration schemes. As an immediate practical benefit, we demonstrate how one can a priori select an optimal data augmentation strategy for a particular generation task.
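
The abstract contrasts the paper's functional-space view with prior parameter-space analyses, in which GAN training is a finite-dimensional dynamical system and local convergence is read off from the eigenvalues of the Jacobian of the gradient vector field at an equilibrium. As a quick illustration of that baseline idea only (not the paper's functional-space method), the sketch below uses the standard Dirac-GAN toy problem; the bilinear loss and the penalty coefficient gamma are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch (not from the paper): local convergence of
# continuous-time GAN training in *parameter space*, read off from the
# eigenvalues of the Jacobian of the gradient vector field.
#
# Toy problem (Dirac-GAN): real data is a point mass at 0, the generator
# produces a point mass at theta, the discriminator is D(x) = psi * x.
# For the bilinear game L(theta, psi) = theta * psi the simultaneous
# gradient flow is
#     d theta / dt = -dL/dtheta = -psi          (generator descends)
#     d psi   / dt = +dL/dpsi   =  theta        (discriminator ascends)
# A gradient penalty (gamma / 2) * psi**2 on the discriminator adds
# -gamma * psi to the second equation.

import numpy as np

def jacobian_at_equilibrium(gamma: float) -> np.ndarray:
    """Jacobian of the training vector field v(theta, psi) at (0, 0)."""
    return np.array([[0.0, -1.0],
                     [1.0, -gamma]])

for gamma in [0.0, 0.5, 1.0]:
    eigvals = np.linalg.eigvals(jacobian_at_equilibrium(gamma))
    # Local convergence requires all eigenvalues to have strictly
    # negative real parts (small tolerance for numerical round-off).
    converges = bool(np.all(eigvals.real < -1e-12))
    print(f"gamma = {gamma:.1f}: eigenvalues = {eigvals}, "
          f"locally convergent: {converges}")

# gamma = 0.0 gives purely imaginary eigenvalues +-i: the dynamics only
# oscillates around the equilibrium. Any gamma > 0 moves both eigenvalues
# into the left half-plane, so the continuous-time dynamics converges locally.
```

This is the kind of eigenvalue-based convergence criterion the abstract refers to; the paper lifts it from the model parameter space to a general functional space, where the linearized dynamics becomes a differential operator whose spectrum can be estimated from the target dataset.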

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-khrulkov21a,
  title     = {Functional Space Analysis of Local GAN Convergence},
  author    = {Khrulkov, Valentin and Babenko, Artem and Oseledets, Ivan},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {5432--5442},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/khrulkov21a/khrulkov21a.pdf},
  url       = {https://proceedings.mlr.press/v139/khrulkov21a.html},
  abstract  = {Recent work demonstrated the benefits of studying continuous-time dynamics governing the GAN training. However, this dynamics is analyzed in the model parameter space, which results in finite-dimensional dynamical systems. We propose a novel perspective where we study the local dynamics of adversarial training in the general functional space and show how it can be represented as a system of partial differential equations. Thus, the convergence properties can be inferred from the eigenvalues of the resulting differential operator. We show that these eigenvalues can be efficiently estimated from the target dataset before training. Our perspective reveals several insights on the practical tricks commonly used to stabilize GANs, such as gradient penalty, data augmentation, and advanced integration schemes. As an immediate practical benefit, we demonstrate how one can a priori select an optimal data augmentation strategy for a particular generation task.}
}
Endnote
%0 Conference Paper
%T Functional Space Analysis of Local GAN Convergence
%A Valentin Khrulkov
%A Artem Babenko
%A Ivan Oseledets
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-khrulkov21a
%I PMLR
%P 5432--5442
%U https://proceedings.mlr.press/v139/khrulkov21a.html
%V 139
%X Recent work demonstrated the benefits of studying continuous-time dynamics governing the GAN training. However, this dynamics is analyzed in the model parameter space, which results in finite-dimensional dynamical systems. We propose a novel perspective where we study the local dynamics of adversarial training in the general functional space and show how it can be represented as a system of partial differential equations. Thus, the convergence properties can be inferred from the eigenvalues of the resulting differential operator. We show that these eigenvalues can be efficiently estimated from the target dataset before training. Our perspective reveals several insights on the practical tricks commonly used to stabilize GANs, such as gradient penalty, data augmentation, and advanced integration schemes. As an immediate practical benefit, we demonstrate how one can a priori select an optimal data augmentation strategy for a particular generation task.
APA
Khrulkov, V., Babenko, A. & Oseledets, I. (2021). Functional Space Analysis of Local GAN Convergence. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:5432-5442. Available from https://proceedings.mlr.press/v139/khrulkov21a.html.