MS$^3$D: A RG Flow-Based Regularization for GAN Training with Limited Data

Jian Wang, Xin Lan, Yuxin Tian, Jiancheng Lv
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:50746-50765, 2024.

Abstract

Generative adversarial networks (GANs) have made impressive advances in image generation, but they often require large-scale training data to avoid degradation caused by discriminator overfitting. To tackle this issue, we investigate the challenge of training GANs with limited data, and propose a novel regularization method based on the idea of the renormalization group (RG) in physics. We observe that in the limited data setting, the gradient pattern that the generator obtains from the discriminator becomes more aggregated over time. In the RG context, this aggregated pattern exhibits a high discrepancy from its coarse-grained versions, which implies a high-capacity and sensitive system, prone to overfitting and collapse. To address this problem, we introduce a multi-scale structural self-dissimilarity (MS$^3$D) regularization, which constrains the gradient field to have a consistent pattern across different scales, thereby fostering a more redundant and robust system. We show that our method can effectively enhance the performance and stability of GANs under limited data scenarios, and even allow them to generate high-quality images with very few data.
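The abstract describes coarse-graining the generator's gradient field and penalizing the discrepancy between the field and its coarse-grained versions. The paper's exact formulation is not reproduced on this page; the following is a minimal NumPy sketch of the idea, assuming RG-style block-average coarse-graining and a normalized mean-squared discrepancy summed over scales. The names `coarse_grain` and `ms3d_penalty` and all parameters are illustrative, not the authors' API.

```python
import numpy as np

def coarse_grain(field, factor=2):
    """RG-style coarse-graining: replace each (factor x factor) block by its mean."""
    h, w = field.shape
    h2, w2 = h // factor, w // factor
    blocks = field[:h2 * factor, :w2 * factor].reshape(h2, factor, w2, factor)
    return blocks.mean(axis=(1, 3))

def ms3d_penalty(grad_field, num_scales=3):
    """Sum over scales of the normalized discrepancy between the gradient
    field and its coarse-grained (then nearest-neighbour upsampled) version.
    A low value means the field looks similar across scales."""
    penalty = 0.0
    current = grad_field
    for _ in range(num_scales):
        coarse = coarse_grain(current)
        # Upsample back by repetition so the two fields can be compared pointwise.
        up = np.repeat(np.repeat(coarse, 2, axis=0), 2, axis=1)
        h, w = up.shape
        diff = current[:h, :w] - up
        penalty += np.mean(diff ** 2) / (np.mean(current[:h, :w] ** 2) + 1e-8)
        current = coarse
    return penalty

# A spatially uniform field is scale-invariant, so its penalty is ~0;
# an irregular field changes under coarse-graining and is penalized.
uniform = np.ones((8, 8))
noisy = np.random.default_rng(0).normal(size=(8, 8))
print(ms3d_penalty(uniform), ms3d_penalty(noisy))
```

In training, such a penalty would be added to the generator or discriminator loss with a weighting coefficient, encouraging gradient patterns that remain consistent under coarse-graining.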

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-wang24af,
  title     = {{MS}$^3$D: A {RG} Flow-Based Regularization for {GAN} Training with Limited Data},
  author    = {Wang, Jian and Lan, Xin and Tian, Yuxin and Lv, Jiancheng},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {50746--50765},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/wang24af/wang24af.pdf},
  url       = {https://proceedings.mlr.press/v235/wang24af.html},
  abstract  = {Generative adversarial networks (GANs) have made impressive advances in image generation, but they often require large-scale training data to avoid degradation caused by discriminator overfitting. To tackle this issue, we investigate the challenge of training GANs with limited data, and propose a novel regularization method based on the idea of the renormalization group (RG) in physics. We observe that in the limited data setting, the gradient pattern that the generator obtains from the discriminator becomes more aggregated over time. In the RG context, this aggregated pattern exhibits a high discrepancy from its coarse-grained versions, which implies a high-capacity and sensitive system, prone to overfitting and collapse. To address this problem, we introduce a multi-scale structural self-dissimilarity (MS$^3$D) regularization, which constrains the gradient field to have a consistent pattern across different scales, thereby fostering a more redundant and robust system. We show that our method can effectively enhance the performance and stability of GANs under limited data scenarios, and even allow them to generate high-quality images with very few data.}
}
Endnote
%0 Conference Paper
%T MS$^3$D: A RG Flow-Based Regularization for GAN Training with Limited Data
%A Jian Wang
%A Xin Lan
%A Yuxin Tian
%A Jiancheng Lv
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-wang24af
%I PMLR
%P 50746--50765
%U https://proceedings.mlr.press/v235/wang24af.html
%V 235
%X Generative adversarial networks (GANs) have made impressive advances in image generation, but they often require large-scale training data to avoid degradation caused by discriminator overfitting. To tackle this issue, we investigate the challenge of training GANs with limited data, and propose a novel regularization method based on the idea of the renormalization group (RG) in physics. We observe that in the limited data setting, the gradient pattern that the generator obtains from the discriminator becomes more aggregated over time. In the RG context, this aggregated pattern exhibits a high discrepancy from its coarse-grained versions, which implies a high-capacity and sensitive system, prone to overfitting and collapse. To address this problem, we introduce a multi-scale structural self-dissimilarity (MS$^3$D) regularization, which constrains the gradient field to have a consistent pattern across different scales, thereby fostering a more redundant and robust system. We show that our method can effectively enhance the performance and stability of GANs under limited data scenarios, and even allow them to generate high-quality images with very few data.
APA
Wang, J., Lan, X., Tian, Y. & Lv, J. (2024). MS$^3$D: A RG Flow-Based Regularization for GAN Training with Limited Data. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:50746-50765. Available from https://proceedings.mlr.press/v235/wang24af.html.