MMDBayes: Robust Bayesian Estimation via Maximum Mean Discrepancy
Proceedings of The 2nd Symposium on
Advances in Approximate Bayesian Inference, PMLR 118:121, 2020.
Abstract
In some misspecified settings, the posterior distribution in Bayesian statistics may lead to inconsistent estimates. To fix this issue, it has been suggested to replace the likelihood by a pseudo-likelihood, that is, the exponential of a loss function enjoying suitable robustness properties. In this paper, we build a pseudo-likelihood based on the Maximum Mean Discrepancy, defined via an embedding of probability distributions into a reproducing kernel Hilbert space. We show that this MMD-Bayes posterior is consistent and robust to model misspecification. As the posterior obtained in this way might be intractable, we also prove that reasonable variational approximations of this posterior enjoy the same properties. We provide details on a stochastic gradient algorithm to compute these variational approximations. Numerical simulations indeed suggest that our estimator is more robust to misspecification than the ones based on the likelihood.
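The central quantity in the abstract is the Maximum Mean Discrepancy between two distributions, defined through the kernel mean embedding into an RKHS. As a minimal sketch (not the paper's implementation), the standard unbiased estimator of the squared MMD between two samples, with a Gaussian kernel and an arbitrary bandwidth parameter `gamma` chosen here for illustration, can be written as:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    # Gaussian (RBF) kernel: k(x, y) = exp(-gamma * ||x - y||^2),
    # evaluated pairwise between the rows of x and y.
    sq_dists = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq_dists)

def mmd2_unbiased(x, y, gamma=1.0):
    # Standard unbiased estimator of the squared MMD between the
    # distributions that generated samples x (n points) and y (m points):
    # diagonal terms of k(x, x) and k(y, y) are excluded.
    kxx = rbf_kernel(x, x, gamma)
    kyy = rbf_kernel(y, y, gamma)
    kxy = rbf_kernel(x, y, gamma)
    n, m = len(x), len(y)
    term_xx = (kxx.sum() - np.trace(kxx)) / (n * (n - 1))
    term_yy = (kyy.sum() - np.trace(kyy)) / (m * (m - 1))
    return term_xx + term_yy - 2.0 * kxy.mean()

# Samples from two well-separated Gaussians give a clearly larger
# MMD estimate than two samples from the same distribution.
rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, size=(200, 1))
b = rng.normal(3.0, 1.0, size=(200, 1))
c = rng.normal(0.0, 1.0, size=(200, 1))
print(mmd2_unbiased(a, b), mmd2_unbiased(a, c))
```

In the MMD-Bayes construction described above, such an MMD-based loss replaces the negative log-likelihood inside the exponential defining the pseudo-likelihood; the robustness comes from the fact that the MMD is bounded, so a small fraction of outliers can only perturb the loss by a bounded amount.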