Conditional diffusions for amortized neural posterior estimation

Tianyu Chen, Vansh Bansal, James G. Scott
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:2377-2385, 2025.

Abstract

Neural posterior estimation (NPE), a simulation-based computational approach for Bayesian inference, has shown great success in approximating complex posterior distributions. Existing NPE methods typically rely on normalizing flows, which approximate a distribution by composing many simple, invertible transformations. But flow-based models, while state of the art for NPE, are known to suffer from several limitations, including training instability and sharp trade-offs between representational power and computational cost. In this work, we demonstrate the effectiveness of conditional diffusions coupled with high-capacity summary networks for amortized NPE. Conditional diffusions address many of the challenges faced by flow-based methods. Our results show that, across a highly varied suite of benchmarking problems for NPE architectures, diffusions offer improved stability, superior accuracy, and faster training times, even with simpler, shallower models. Building on prior work on diffusions for NPE, we show that these gains persist across a variety of different summary network architectures. Code is available at \url{https://github.com/TianyuCodings/cDiff}.
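For readers unfamiliar with the approach, the sketch below illustrates the kind of training objective a conditional diffusion for NPE typically uses: a denoiser network learns to predict the noise added to the parameters theta, conditioned on a learned summary of the simulated data x. This is a minimal illustration under assumed choices (a DDPM-style noise-prediction parameterization, a cosine noise schedule, an MLP denoiser); it is not the authors' cDiff implementation, and all names and sizes here are illustrative.

# Minimal sketch of conditional-diffusion NPE training (illustrative;
# not the cDiff code -- architecture, schedule, and names are assumptions).
import torch
import torch.nn as nn

class ConditionalDenoiser(nn.Module):
    """Predicts the noise added to theta, conditioned on a data summary and timestep."""
    def __init__(self, theta_dim, summary_dim, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(theta_dim + summary_dim + 1, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, theta_dim),
        )

    def forward(self, theta_t, summary, t):
        return self.net(torch.cat([theta_t, summary, t], dim=-1))

def diffusion_loss(model, summary_net, theta, x, T=1000):
    """Denoising objective on simulated (theta, x) pairs from the prior and simulator."""
    b = theta.shape[0]
    t = torch.randint(1, T + 1, (b, 1)).float() / T       # random timestep in (0, 1]
    alpha_bar = torch.cos(0.5 * torch.pi * t) ** 2         # cosine noise schedule
    eps = torch.randn_like(theta)
    theta_t = alpha_bar.sqrt() * theta + (1 - alpha_bar).sqrt() * eps
    eps_hat = model(theta_t, summary_net(x), t)             # condition on summary of x
    return ((eps - eps_hat) ** 2).mean()

Once trained on simulated (theta, x) pairs, the reverse diffusion can be run conditioned on summary_net(x_obs) to draw approximate posterior samples for any new observation x_obs without retraining, which is what makes the estimator amortized.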

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-chen25d,
  title = {Conditional diffusions for amortized neural posterior estimation},
  author = {Chen, Tianyu and Bansal, Vansh and Scott, James G.},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages = {2377--2385},
  year = {2025},
  editor = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume = {258},
  series = {Proceedings of Machine Learning Research},
  month = {03--05 May},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/chen25d/chen25d.pdf},
  url = {https://proceedings.mlr.press/v258/chen25d.html},
  abstract = {Neural posterior estimation (NPE), a simulation-based computational approach for Bayesian inference, has shown great success in approximating complex posterior distributions. Existing NPE methods typically rely on normalizing flows, which approximate a distribution by composing many simple, invertible transformations. But flow-based models, while state of the art for NPE, are known to suffer from several limitations, including training instability and sharp trade-offs between representational power and computational cost. In this work, we demonstrate the effectiveness of conditional diffusions coupled with high-capacity summary networks for amortized NPE. Conditional diffusions address many of the challenges faced by flow-based methods. Our results show that, across a highly varied suite of benchmarking problems for NPE architectures, diffusions offer improved stability, superior accuracy, and faster training times, even with simpler, shallower models. Building on prior work on diffusions for NPE, we show that these gains persist across a variety of different summary network architectures. Code is available at \url{https://github.com/TianyuCodings/cDiff}.}
}
Endnote
%0 Conference Paper
%T Conditional diffusions for amortized neural posterior estimation
%A Tianyu Chen
%A Vansh Bansal
%A James G. Scott
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-chen25d
%I PMLR
%P 2377--2385
%U https://proceedings.mlr.press/v258/chen25d.html
%V 258
%X Neural posterior estimation (NPE), a simulation-based computational approach for Bayesian inference, has shown great success in approximating complex posterior distributions. Existing NPE methods typically rely on normalizing flows, which approximate a distribution by composing many simple, invertible transformations. But flow-based models, while state of the art for NPE, are known to suffer from several limitations, including training instability and sharp trade-offs between representational power and computational cost. In this work, we demonstrate the effectiveness of conditional diffusions coupled with high-capacity summary networks for amortized NPE. Conditional diffusions address many of the challenges faced by flow-based methods. Our results show that, across a highly varied suite of benchmarking problems for NPE architectures, diffusions offer improved stability, superior accuracy, and faster training times, even with simpler, shallower models. Building on prior work on diffusions for NPE, we show that these gains persist across a variety of different summary network architectures. Code is available at \url{https://github.com/TianyuCodings/cDiff}.
APA
Chen, T., Bansal, V. & Scott, J. G. (2025). Conditional diffusions for amortized neural posterior estimation. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:2377-2385. Available from https://proceedings.mlr.press/v258/chen25d.html.