Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1711-1721, 2018.
Abstract
Complex simulator-based models usually have intractable likelihood functions, rendering likelihood-based inference methods inapplicable. Approximate Bayesian Computation (ABC) emerges as an alternative framework of likelihood-free inference methods. It identifies a quasi-posterior distribution by finding parameter values whose simulated synthetic data resemble the observed data. A major ingredient of ABC is the discrepancy measure between the observed and the simulated data, which conventionally involves the fundamental difficulty of constructing effective summary statistics. To bypass this difficulty, we adopt a Kullback-Leibler divergence estimator to assess the data discrepancy. Our method enjoys asymptotic consistency and linearithmic time complexity as the data size increases. In experiments on five benchmark models, this method achieves a comparable or higher quasi-posterior quality than existing methods using other discrepancy measures.
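To make the idea concrete, below is a minimal sketch of rejection ABC using a nearest-neighbor Kullback-Leibler divergence estimator as the data discrepancy. The estimator form (a k-NN density-ratio estimator, here in its 1-NN variant, with KD-trees giving the linearithmic cost mentioned in the abstract) and the helper names `knn_kl_divergence` and `abc_rejection` are illustrative assumptions, not the paper's exact algorithm or code.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of KL(P || Q) from samples x ~ P and y ~ Q.

    Uses KD-tree nearest-neighbor queries, so the cost is
    O(n log n) in the sample size (the "linearithmic" complexity
    referred to in the abstract). Illustrative sketch only.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    if y.ndim == 1:
        y = y[:, None]
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its k-th nearest neighbor in x \ {x_i}
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu_i: distance from x_i to its k-th nearest neighbor in y
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))


def abc_rejection(observed, simulator, prior_sampler, n_draws=500, quantile=0.1):
    """Rejection ABC: keep parameter draws whose synthetic data are
    closest to the observed data under the KL discrepancy."""
    thetas, discrepancies = [], []
    for _ in range(n_draws):
        theta = prior_sampler()
        synthetic = simulator(theta)
        thetas.append(theta)
        discrepancies.append(knn_kl_divergence(observed, synthetic))
    eps = np.quantile(discrepancies, quantile)  # adaptive tolerance
    return np.array([t for t, dis in zip(thetas, discrepancies) if dis <= eps])
```

As a quick sanity check, the accepted draws for a Gaussian location model with observed data from N(0, 1) concentrate around 0, and the KL estimate between two samples from the same distribution is much smaller than between samples from well-separated distributions.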