Compositional Score Modeling for Simulation-Based Inference

Tomas Geffner, George Papamakarios, Andriy Mnih
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:11098-11116, 2023.

Abstract

Neural Posterior Estimation methods for simulation-based inference can be ill-suited for dealing with posterior distributions obtained by conditioning on multiple observations, as they tend to require a large number of simulator calls to learn accurate approximations. In contrast, Neural Likelihood Estimation methods can handle multiple observations at inference time after learning from individual observations, but they rely on standard inference methods, such as MCMC or variational inference, which come with certain performance drawbacks. We introduce a new method based on conditional score modeling that enjoys the benefits of both approaches. We model the scores of the (diffused) posterior distributions induced by individual observations, and introduce a way of combining the learned scores to approximately sample from the target posterior distribution. Our approach is sample-efficient, can naturally aggregate multiple observations at inference time, and avoids the drawbacks of standard inference methods.
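To make the high-level description concrete, the following is a minimal sketch of the kind of score composition and sampling the abstract describes. It is not the authors' released implementation: it assumes a standard normal prior, a variance-preserving diffusion (under which the diffused prior stays standard normal), and a pretrained per-observation score model score_net(theta, x, t); the function names, hyperparameters, and the simple annealed Langevin sampler are illustrative, and the paper's actual combination rule includes corrections beyond this naive version.

# Hedged sketch: combining learned single-observation posterior scores and
# sampling with annealed Langevin dynamics. Assumptions as stated above.

import numpy as np

def diffused_prior_score(theta, t):
    # Assumed standard normal prior under a variance-preserving diffusion:
    # the diffused prior remains N(0, I), so its score is simply -theta.
    return -theta

def combined_score(theta, observations, t, score_net):
    # Naive compositional rule for n conditionally independent observations:
    #   sum_i s(theta | x_i, t)  -  (n - 1) * score of the diffused prior.
    n = len(observations)
    total = sum(score_net(theta, x, t) for x in observations)
    return total - (n - 1) * diffused_prior_score(theta, t)

def annealed_langevin_sample(observations, score_net, dim,
                             n_steps=1000, step=1e-3, rng=None):
    # Draw one approximate posterior sample by following the combined score
    # along a decreasing diffusion time (a crude stand-in for the reverse SDE).
    rng = np.random.default_rng() if rng is None else rng
    theta = rng.standard_normal(dim)
    for k in range(n_steps):
        t = 1.0 - k / n_steps  # anneal diffusion time from 1 down to 0
        g = combined_score(theta, observations, t, score_net)
        theta = theta + step * g + np.sqrt(2 * step) * rng.standard_normal(dim)
    return theta

Because the per-observation score networks are trained from single simulator calls, any number of observations can be aggregated at inference time simply by summing their scores in combined_score, which is the property the abstract highlights.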

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-geffner23a,
  title     = {Compositional Score Modeling for Simulation-Based Inference},
  author    = {Geffner, Tomas and Papamakarios, George and Mnih, Andriy},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {11098--11116},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/geffner23a/geffner23a.pdf},
  url       = {https://proceedings.mlr.press/v202/geffner23a.html}
}
