Improving Compositional Generation with Diffusion Models Using Lift Scores

Chenning Yu, Sicun Gao
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:72944-72971, 2025.

Abstract

We introduce a novel resampling criterion using lift scores for improving compositional generation in diffusion models. By leveraging lift scores, we evaluate whether generated samples align with each individual condition and then compose the results to determine whether the composed prompt is satisfied. Our key insight is that lift scores can be efficiently approximated using only the original diffusion model, requiring no additional training or external modules. We develop an optimized variant that achieves lower computational overhead during inference while maintaining effectiveness. Through extensive experiments, we demonstrate that lift scores significantly improve condition alignment for compositional generation across 2D synthetic data, CLEVR position tasks, and text-to-image synthesis. Our code is available at github.com/rainorangelemon/complift.
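The resampling criterion described above can be sketched in a few lines. This is a minimal illustration, assuming the lift score for a single condition is approximated by the gap between the unconditional and conditional denoising errors of the diffusion model, and that a conjunctive prompt is accepted only when every condition's lift is positive; the function names and the acceptance rule below are hypothetical simplifications, not the paper's exact implementation.

```python
import numpy as np

def lift_score(denoise_err_uncond, denoise_err_cond):
    # Approximate lift of a sample under condition c as the average
    # unconditional denoising error minus the average conditional one;
    # a positive value suggests the sample aligns with c.
    return float(np.mean(denoise_err_uncond) - np.mean(denoise_err_cond))

def accept(per_condition_errs, uncond_errs):
    # Compose the per-condition checks with a logical AND: the sample is
    # kept (not resampled) only if it lifts every condition in the prompt.
    return all(lift_score(uncond_errs, errs) > 0 for errs in per_condition_errs)

# Toy usage with precomputed denoising errors over a few timesteps:
uncond = [0.6, 0.7]                      # errors without conditioning
conds = [[0.5, 0.4], [0.3, 0.2]]         # errors under each condition
print(accept(conds, uncond))             # both lifts positive -> True
```

In a real pipeline these error lists would come from evaluating the same pretrained denoiser with and without each condition; samples failing the check would be drawn again.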

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-yu25e,
  title     = {Improving Compositional Generation with Diffusion Models Using Lift Scores},
  author    = {Yu, Chenning and Gao, Sicun},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {72944--72971},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/yu25e/yu25e.pdf},
  url       = {https://proceedings.mlr.press/v267/yu25e.html},
  abstract  = {We introduce a novel resampling criterion using lift scores, for improving compositional generation in diffusion models. By leveraging the lift scores, we evaluate whether generated samples align with each single condition and then compose the results to determine whether the composed prompt is satisfied. Our key insight is that lift scores can be efficiently approximated using only the original diffusion model, requiring no additional training or external modules. We develop an optimized variant that achieves relatively lower computational overhead during inference while maintaining effectiveness. Through extensive experiments, we demonstrate that lift scores significantly improved the condition alignment for compositional generation across 2D synthetic data, CLEVR position tasks, and text-to-image synthesis. Our code is available at github.com/rainorangelemon/complift.}
}
Endnote
%0 Conference Paper
%T Improving Compositional Generation with Diffusion Models Using Lift Scores
%A Chenning Yu
%A Sicun Gao
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-yu25e
%I PMLR
%P 72944--72971
%U https://proceedings.mlr.press/v267/yu25e.html
%V 267
%X We introduce a novel resampling criterion using lift scores, for improving compositional generation in diffusion models. By leveraging the lift scores, we evaluate whether generated samples align with each single condition and then compose the results to determine whether the composed prompt is satisfied. Our key insight is that lift scores can be efficiently approximated using only the original diffusion model, requiring no additional training or external modules. We develop an optimized variant that achieves relatively lower computational overhead during inference while maintaining effectiveness. Through extensive experiments, we demonstrate that lift scores significantly improved the condition alignment for compositional generation across 2D synthetic data, CLEVR position tasks, and text-to-image synthesis. Our code is available at github.com/rainorangelemon/complift.
APA
Yu, C. &amp; Gao, S. (2025). Improving Compositional Generation with Diffusion Models Using Lift Scores. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:72944-72971. Available from https://proceedings.mlr.press/v267/yu25e.html.

Related Material