Testing Generated Distributions in GANs to Penalize Mode Collapse

Yanxiang Gong, Zhiwei Xie, Mei Xie, Xin Ma
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:442-450, 2024.

Abstract

Mode collapse remains the primary unresolved challenge within generative adversarial networks (GANs). In this work, we introduce an innovative approach that supplements the discriminator by additionally enforcing the similarity between the generated and real distributions. We implement a one-sample test on the generated samples and employ the resulting test statistic to penalize deviations from the real distribution. Our method encompasses a practical strategy to estimate distributions, compute the test statistic via a differentiable function, and seamlessly incorporate test outcomes into the training objective. Crucially, our approach preserves the convergence and theoretical integrity of GANs, as the introduced constraint represents a requisite condition for optimizing the generator training objective. Notably, our method circumvents reliance on regularization or network modules, enhancing compatibility and facilitating its practical application. Empirical evaluations on diverse public datasets validate the efficacy of our proposed approach.
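To make the core idea concrete: the paper penalizes the generator with a differentiable statistic that measures mismatch between the generated and real distributions. The sketch below is a simplified illustration, not the authors' actual test: it uses a kernel-based (MMD-style) statistic, with hypothetical helper names `rbf_kernel` and `mmd_statistic`, to show how a mode-collapsed generator incurs a larger distribution-mismatch penalty than one matching the real distribution. Such a statistic is differentiable in the generated samples, so it can be added to the generator's training objective.

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise Gaussian kernel between two sets of 1-D samples.
    d = x[:, None] - y[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

def mmd_statistic(gen, ref, sigma=1.0):
    """Squared maximum mean discrepancy between generated and reference samples.

    Zero in expectation when the two distributions match, and positive
    otherwise, so it can act as a penalty on distribution mismatch.
    """
    k_gg = rbf_kernel(gen, gen, sigma).mean()
    k_rr = rbf_kernel(ref, ref, sigma).mean()
    k_gr = rbf_kernel(gen, ref, sigma).mean()
    return k_gg + k_rr - 2 * k_gr

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=500)   # samples from the "real" distribution
good = rng.normal(0.0, 1.0, size=500)   # a generator matching the real distribution
collapsed = np.full(500, 0.5)           # a mode-collapsed generator (single point)

stat_good = mmd_statistic(good, real)
stat_bad = mmd_statistic(collapsed, real)
print(stat_bad > stat_good)  # the collapsed generator gets a larger penalty
```

In a training loop, the scalar statistic would be scaled by a weight and added to the usual adversarial generator loss, steering the generator away from collapsed solutions.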

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-gong24a,
  title     = {Testing Generated Distributions in GANs to Penalize Mode Collapse},
  author    = {Gong, Yanxiang and Xie, Zhiwei and Xie, Mei and Ma, Xin},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {442--450},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/gong24a/gong24a.pdf},
  url       = {https://proceedings.mlr.press/v238/gong24a.html},
  abstract  = {Mode collapse remains the primary unresolved challenge within generative adversarial networks (GANs). In this work, we introduce an innovative approach that supplements the discriminator by additionally enforcing the similarity between the generated and real distributions. We implement a one-sample test on the generated samples and employ the resulting test statistic to penalize deviations from the real distribution. Our method encompasses a practical strategy to estimate distributions, compute the test statistic via a differentiable function, and seamlessly incorporate test outcomes into the training objective. Crucially, our approach preserves the convergence and theoretical integrity of GANs, as the introduced constraint represents a requisite condition for optimizing the generator training objective. Notably, our method circumvents reliance on regularization or network modules, enhancing compatibility and facilitating its practical application. Empirical evaluations on diverse public datasets validate the efficacy of our proposed approach.}
}
Endnote
%0 Conference Paper
%T Testing Generated Distributions in GANs to Penalize Mode Collapse
%A Yanxiang Gong
%A Zhiwei Xie
%A Mei Xie
%A Xin Ma
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-gong24a
%I PMLR
%P 442--450
%U https://proceedings.mlr.press/v238/gong24a.html
%V 238
%X Mode collapse remains the primary unresolved challenge within generative adversarial networks (GANs). In this work, we introduce an innovative approach that supplements the discriminator by additionally enforcing the similarity between the generated and real distributions. We implement a one-sample test on the generated samples and employ the resulting test statistic to penalize deviations from the real distribution. Our method encompasses a practical strategy to estimate distributions, compute the test statistic via a differentiable function, and seamlessly incorporate test outcomes into the training objective. Crucially, our approach preserves the convergence and theoretical integrity of GANs, as the introduced constraint represents a requisite condition for optimizing the generator training objective. Notably, our method circumvents reliance on regularization or network modules, enhancing compatibility and facilitating its practical application. Empirical evaluations on diverse public datasets validate the efficacy of our proposed approach.
APA
Gong, Y., Xie, Z., Xie, M. & Ma, X. (2024). Testing Generated Distributions in GANs to Penalize Mode Collapse. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:442-450. Available from https://proceedings.mlr.press/v238/gong24a.html.