Bridging Data Gaps in Diffusion Models with Adversarial Noise-Based Transfer Learning

Xiyu Wang, Baijiong Lin, Daochang Liu, Ying-Cong Chen, Chang Xu
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:50944-50959, 2024.

Abstract

Diffusion Probabilistic Models (DPMs) show significant potential in image generation, yet their performance hinges on having access to large datasets. Previous works, such as Generative Adversarial Networks (GANs), have tackled the limited-data problem by transferring pre-trained models learned with sufficient data. However, those methods are difficult to apply to DPMs because of the distinct differences between DPM-based and GAN-based methods, namely the iterative denoising process integral to DPMs and their need for many timesteps with non-targeted noise. In this paper, we propose a novel DPM-based transfer learning method, ANT, to address the limited-data problem. It includes two strategies: similarity-guided training, which boosts transfer with a classifier, and adversarial noise selection, which adaptively chooses targeted noise based on the input image. Extensive experiments on few-shot image generation tasks demonstrate that our method is not only efficient but also excels in terms of image quality and diversity when compared to existing GAN-based and DDPM-based methods.
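For readers who want a concrete picture of the two strategies named above, the sketch below shows one plausible way adversarial noise selection and similarity-guided training could be combined in a single diffusion training step. It is not taken from the paper's implementation: the denoiser and classifier interfaces, the noise schedule handling, and the hyperparameters adv_steps and adv_lr are illustrative assumptions made here.

# Minimal, hypothetical sketch (not the paper's code): adversarially select the
# noise for each input image, then weight the denoising loss with a classifier-
# based similarity score. All names and interfaces are assumptions.
import torch
import torch.nn.functional as F

def ant_style_step(denoiser, classifier, x0, alphas_cumprod, adv_steps=3, adv_lr=0.05):
    """One illustrative training step on a batch x0 of target-domain images."""
    b = x0.size(0)
    t = torch.randint(0, alphas_cumprod.numel(), (b,), device=x0.device)
    a_bar = alphas_cumprod.to(x0.device)[t].view(b, 1, 1, 1)

    # Adversarial noise selection: start from Gaussian noise and adapt it to the
    # input image by ascending the denoising loss for a few gradient steps,
    # instead of using purely random (non-targeted) noise.
    eps = torch.randn_like(x0, requires_grad=True)
    for _ in range(adv_steps):
        x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * eps
        loss_adv = F.mse_loss(denoiser(x_t, t), eps)
        grad, = torch.autograd.grad(loss_adv, eps)
        eps = (eps + adv_lr * grad.sign()).detach().requires_grad_(True)

    eps = eps.detach()
    x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * eps

    # Similarity-guided weighting: a classifier scores how close each noisy
    # sample is to the target domain; a single-logit sigmoid stands in here.
    with torch.no_grad():
        sim = torch.sigmoid(classifier(x_t, t)).view(b)

    per_sample = F.mse_loss(denoiser(x_t, t), eps, reduction="none").mean(dim=(1, 2, 3))
    return (sim * per_sample).mean()

The returned loss would then be backpropagated through the pre-trained denoiser as in standard DPM fine-tuning; the adversarial inner loop and the classifier weighting are the only additions in this sketch.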

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-wang24ap,
  title     = {Bridging Data Gaps in Diffusion Models with Adversarial Noise-Based Transfer Learning},
  author    = {Wang, Xiyu and Lin, Baijiong and Liu, Daochang and Chen, Ying-Cong and Xu, Chang},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {50944--50959},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/wang24ap/wang24ap.pdf},
  url       = {https://proceedings.mlr.press/v235/wang24ap.html},
  abstract  = {Diffusion Probabilistic Models (DPMs) show significant potential in image generation, yet their performance hinges on having access to large datasets. Previous works, such as Generative Adversarial Networks (GANs), have tackled the limited-data problem by transferring pre-trained models learned with sufficient data. However, those methods are difficult to apply to DPMs because of the distinct differences between DPM-based and GAN-based methods, namely the iterative denoising process integral to DPMs and their need for many timesteps with non-targeted noise. In this paper, we propose a novel DPM-based transfer learning method, ANT, to address the limited-data problem. It includes two strategies: similarity-guided training, which boosts transfer with a classifier, and adversarial noise selection, which adaptively chooses targeted noise based on the input image. Extensive experiments on few-shot image generation tasks demonstrate that our method is not only efficient but also excels in terms of image quality and diversity when compared to existing GAN-based and DDPM-based methods.}
}
Endnote
%0 Conference Paper
%T Bridging Data Gaps in Diffusion Models with Adversarial Noise-Based Transfer Learning
%A Xiyu Wang
%A Baijiong Lin
%A Daochang Liu
%A Ying-Cong Chen
%A Chang Xu
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-wang24ap
%I PMLR
%P 50944--50959
%U https://proceedings.mlr.press/v235/wang24ap.html
%V 235
%X Diffusion Probabilistic Models (DPMs) show significant potential in image generation, yet their performance hinges on having access to large datasets. Previous works, such as Generative Adversarial Networks (GANs), have tackled the limited-data problem by transferring pre-trained models learned with sufficient data. However, those methods are difficult to apply to DPMs because of the distinct differences between DPM-based and GAN-based methods, namely the iterative denoising process integral to DPMs and their need for many timesteps with non-targeted noise. In this paper, we propose a novel DPM-based transfer learning method, ANT, to address the limited-data problem. It includes two strategies: similarity-guided training, which boosts transfer with a classifier, and adversarial noise selection, which adaptively chooses targeted noise based on the input image. Extensive experiments on few-shot image generation tasks demonstrate that our method is not only efficient but also excels in terms of image quality and diversity when compared to existing GAN-based and DDPM-based methods.
APA
Wang, X., Lin, B., Liu, D., Chen, Y. & Xu, C. (2024). Bridging Data Gaps in Diffusion Models with Adversarial Noise-Based Transfer Learning. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:50944-50959. Available from https://proceedings.mlr.press/v235/wang24ap.html.
