Neural Empirical Bayes: Source Distribution Estimation and its Applications to Simulation-Based Inference

Maxime Vandegar, Michael Kagan, Antoine Wehenkel, Gilles Louppe
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:2107-2115, 2021.

Abstract

We revisit g-modeling empirical Bayes in the absence of a tractable likelihood function, as is typical in scientific domains relying on computer simulations. We investigate how the empirical Bayesian can make use of neural density estimators first to use all noise-corrupted observations to estimate a prior or source distribution over uncorrupted samples, and then to perform single-observation posterior inference using the fitted source distribution. We propose an approach based on the direct maximization of the log-marginal likelihood of the observations, examining both biased and de-biased estimators, and comparing to variational approaches. We find that, up to symmetries, a neural empirical Bayes approach recovers ground truth source distributions. With the learned source distribution in hand, we show the applicability to likelihood-free inference and examine the quality of the resulting posterior estimates. Finally, we demonstrate the applicability of Neural Empirical Bayes on an inverse problem from collider physics.
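The core idea sketched in the abstract — fit a source distribution by directly maximizing a Monte Carlo estimate of the log-marginal likelihood of noise-corrupted observations — can be illustrated on a toy problem. This is a minimal sketch under assumed settings, not the paper's simulator or neural architecture: the source model here is a single-parameter Gaussian fitted by grid search, standing in for a neural density estimator trained by gradient ascent, and the estimator shown is the biased log-mean-exp form the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative assumption, not the paper's simulator):
# latent theta ~ N(mu_true, tau^2), observation x = theta + N(0, sigma^2).
mu_true, tau, sigma = 2.0, 1.0, 0.5
theta_true = rng.normal(mu_true, tau, size=500)
x_obs = theta_true + rng.normal(0.0, sigma, size=500)

def mc_log_marginal(mu, x, n_samples=256):
    """Biased Monte Carlo estimate of sum_i log p(x_i):
    log p(x_i) ~= log mean_m N(x_i; theta_m, sigma^2), theta_m ~ N(mu, tau^2)."""
    theta = rng.normal(mu, tau, size=n_samples)  # samples from the source model
    # log-likelihood of every observation under every sampled theta
    ll = (-0.5 * ((x[:, None] - theta[None, :]) / sigma) ** 2
          - np.log(sigma * np.sqrt(2.0 * np.pi)))
    # log-mean-exp over Monte Carlo samples (numerically stable)
    m = ll.max(axis=1, keepdims=True)
    log_px = m.squeeze(1) + np.log(np.exp(ll - m).mean(axis=1))
    return log_px.sum()

# "g-modeling": pick the source parameter maximizing the estimated log-marginal
grid = np.linspace(-2.0, 6.0, 81)
mu_hat = grid[np.argmax([mc_log_marginal(mu, x_obs) for mu in grid])]
```

With 500 observations, the fitted source mean `mu_hat` lands close to the true value of 2.0; the paper replaces the Gaussian source with a neural density estimator and also studies de-biased and variational alternatives to this estimator.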

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-vandegar21a,
  title     = {Neural Empirical Bayes: Source Distribution Estimation and its Applications to Simulation-Based Inference},
  author    = {Vandegar, Maxime and Kagan, Michael and Wehenkel, Antoine and Louppe, Gilles},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {2107--2115},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/vandegar21a/vandegar21a.pdf},
  url       = {https://proceedings.mlr.press/v130/vandegar21a.html},
  abstract  = {We revisit g-modeling empirical Bayes in the absence of a tractable likelihood function, as is typical in scientific domains relying on computer simulations. We investigate how the empirical Bayesian can make use of neural density estimators first to use all noise-corrupted observations to estimate a prior or source distribution over uncorrupted samples, and then to perform single-observation posterior inference using the fitted source distribution. We propose an approach based on the direct maximization of the log-marginal likelihood of the observations, examining both biased and de-biased estimators, and comparing to variational approaches. We find that, up to symmetries, a neural empirical Bayes approach recovers ground truth source distributions. With the learned source distribution in hand, we show the applicability to likelihood-free inference and examine the quality of the resulting posterior estimates. Finally, we demonstrate the applicability of Neural Empirical Bayes on an inverse problem from collider physics.}
}
EndNote
%0 Conference Paper
%T Neural Empirical Bayes: Source Distribution Estimation and its Applications to Simulation-Based Inference
%A Maxime Vandegar
%A Michael Kagan
%A Antoine Wehenkel
%A Gilles Louppe
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-vandegar21a
%I PMLR
%P 2107--2115
%U https://proceedings.mlr.press/v130/vandegar21a.html
%V 130
%X We revisit g-modeling empirical Bayes in the absence of a tractable likelihood function, as is typical in scientific domains relying on computer simulations. We investigate how the empirical Bayesian can make use of neural density estimators first to use all noise-corrupted observations to estimate a prior or source distribution over uncorrupted samples, and then to perform single-observation posterior inference using the fitted source distribution. We propose an approach based on the direct maximization of the log-marginal likelihood of the observations, examining both biased and de-biased estimators, and comparing to variational approaches. We find that, up to symmetries, a neural empirical Bayes approach recovers ground truth source distributions. With the learned source distribution in hand, we show the applicability to likelihood-free inference and examine the quality of the resulting posterior estimates. Finally, we demonstrate the applicability of Neural Empirical Bayes on an inverse problem from collider physics.
APA
Vandegar, M., Kagan, M., Wehenkel, A. &amp; Louppe, G. (2021). Neural Empirical Bayes: Source Distribution Estimation and its Applications to Simulation-Based Inference. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:2107-2115. Available from https://proceedings.mlr.press/v130/vandegar21a.html.