Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching

Aaron J Havens, Benjamin Kurt Miller, Bing Yan, Carles Domingo-Enrich, Anuroop Sriram, Daniel S. Levine, Brandon M Wood, Bin Hu, Brandon Amos, Brian Karrer, Xiang Fu, Guan-Horng Liu, Ricky T. Q. Chen
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:22204-22237, 2025.

Abstract

We introduce Adjoint Sampling, a highly scalable and efficient algorithm for learning diffusion processes that sample from unnormalized densities, or energy functions. It is the first on-policy approach that allows significantly more gradient updates than the number of energy evaluations and model samples, allowing us to scale to much larger problem settings than previously explored by similar methods. Our framework is theoretically grounded in stochastic optimal control and shares the same theoretical guarantees as Adjoint Matching, being able to train without the need for corrective measures that push samples towards the target distribution. We show how to incorporate key symmetries, as well as periodic boundary conditions, for modeling molecules in both Cartesian and torsional coordinates. We demonstrate the effectiveness of our approach through extensive experiments on classical energy functions, and further scale up to neural network-based energy models where we perform amortized conformer generation across many molecular systems. To encourage further research in developing highly scalable sampling methods, we plan to open source these challenging benchmarks, where successful methods can directly impact progress in computational chemistry. Code and benchmarks provided at https://github.com/facebookresearch/adjoint_sampling.
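For readers unfamiliar with the setting, the task is to draw samples from a density known only up to normalization, p(x) ∝ exp(−E(x)), given only the energy E. The sketch below illustrates that task with plain Langevin dynamics on a toy quadratic energy; it is a minimal stand-in, not the paper's Adjoint Sampling algorithm (which instead trains a diffusion model via stochastic optimal control). The energy, step size, and chain counts are illustrative assumptions.

```python
import numpy as np

def energy_grad(x):
    """Gradient of the toy energy E(x) = 0.5 * x**2, i.e. an
    unnormalized standard Gaussian exp(-E(x))."""
    return x

def langevin_sample(n_chains=10_000, n_steps=500, step=0.01, seed=0):
    """Run unadjusted Langevin dynamics:
    x <- x - step * grad E(x) + sqrt(2 * step) * noise."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n_chains)  # all chains start at the origin
    for _ in range(n_steps):
        noise = rng.standard_normal(n_chains)
        x = x - step * energy_grad(x) + np.sqrt(2.0 * step) * noise
    return x

samples = langevin_sample()
print(samples.mean(), samples.var())  # close to 0 and 1 for this energy
```

Note that this baseline needs a fresh gradient of E at every simulation step for every sample, which is exactly the per-sample energy-evaluation cost that the paper's amortized, on-policy approach is designed to avoid.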

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-havens25a,
  title     = {Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching},
  author    = {Havens, Aaron J and Miller, Benjamin Kurt and Yan, Bing and Domingo-Enrich, Carles and Sriram, Anuroop and Levine, Daniel S. and Wood, Brandon M and Hu, Bin and Amos, Brandon and Karrer, Brian and Fu, Xiang and Liu, Guan-Horng and Chen, Ricky T. Q.},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {22204--22237},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/havens25a/havens25a.pdf},
  url       = {https://proceedings.mlr.press/v267/havens25a.html},
  abstract  = {We introduce Adjoint Sampling, a highly scalable and efficient algorithm for learning diffusion processes that sample from unnormalized densities, or energy functions. It is the first on-policy approach that allows significantly more gradient updates than the number of energy evaluations and model samples, allowing us to scale to much larger problem settings than previously explored by similar methods. Our framework is theoretically grounded in stochastic optimal control and shares the same theoretical guarantees as Adjoint Matching, being able to train without the need for corrective measures that push samples towards the target distribution. We show how to incorporate key symmetries, as well as periodic boundary conditions, for modeling molecules in both Cartesian and torsional coordinates. We demonstrate the effectiveness of our approach through extensive experiments on classical energy functions, and further scale up to neural network-based energy models where we perform amortized conformer generation across many molecular systems. To encourage further research in developing highly scalable sampling methods, we plan to open source these challenging benchmarks, where successful methods can directly impact progress in computational chemistry. Code and benchmarks provided at https://github.com/facebookresearch/adjoint_sampling.}
}
Endnote
%0 Conference Paper
%T Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching
%A Aaron J Havens
%A Benjamin Kurt Miller
%A Bing Yan
%A Carles Domingo-Enrich
%A Anuroop Sriram
%A Daniel S. Levine
%A Brandon M Wood
%A Bin Hu
%A Brandon Amos
%A Brian Karrer
%A Xiang Fu
%A Guan-Horng Liu
%A Ricky T. Q. Chen
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-havens25a
%I PMLR
%P 22204--22237
%U https://proceedings.mlr.press/v267/havens25a.html
%V 267
%X We introduce Adjoint Sampling, a highly scalable and efficient algorithm for learning diffusion processes that sample from unnormalized densities, or energy functions. It is the first on-policy approach that allows significantly more gradient updates than the number of energy evaluations and model samples, allowing us to scale to much larger problem settings than previously explored by similar methods. Our framework is theoretically grounded in stochastic optimal control and shares the same theoretical guarantees as Adjoint Matching, being able to train without the need for corrective measures that push samples towards the target distribution. We show how to incorporate key symmetries, as well as periodic boundary conditions, for modeling molecules in both Cartesian and torsional coordinates. We demonstrate the effectiveness of our approach through extensive experiments on classical energy functions, and further scale up to neural network-based energy models where we perform amortized conformer generation across many molecular systems. To encourage further research in developing highly scalable sampling methods, we plan to open source these challenging benchmarks, where successful methods can directly impact progress in computational chemistry. Code and benchmarks provided at https://github.com/facebookresearch/adjoint_sampling.
APA
Havens, A.J., Miller, B.K., Yan, B., Domingo-Enrich, C., Sriram, A., Levine, D.S., Wood, B.M., Hu, B., Amos, B., Karrer, B., Fu, X., Liu, G. & Chen, R.T.Q. (2025). Adjoint Sampling: Highly Scalable Diffusion Samplers via Adjoint Matching. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:22204-22237. Available from https://proceedings.mlr.press/v267/havens25a.html.