Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy


Bai Jiang;
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1711-1721, 2018.


Complex simulator-based models usually have intractable likelihood functions, rendering likelihood-based inference methods inapplicable. Approximate Bayesian Computation (ABC) emerges as an alternative framework of likelihood-free inference methods. It identifies a quasi-posterior distribution by finding parameter values under which the simulated synthetic data resemble the observed data. A major ingredient of ABC is the discrepancy measure between the observed and the simulated data, which conventionally involves the fundamental difficulty of constructing effective summary statistics. To bypass this difficulty, we adopt a Kullback-Leibler divergence estimator to assess the data discrepancy. Our method enjoys asymptotic consistency and linearithmic time complexity as the data size increases. In experiments on five benchmark models, this method achieves comparable or higher quasi-posterior quality than existing methods using other discrepancy measures.
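The approach described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the common 1-nearest-neighbor KL divergence estimator (whose KD-tree queries give the linearithmic cost the abstract mentions) plugged into plain ABC rejection sampling. The names `simulate`, `prior_sampler`, and the tolerance `eps` are hypothetical placeholders for user-supplied components.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y):
    """1-nearest-neighbor estimate of D(P || Q) from samples x ~ P, y ~ Q.

    x is (n, d), y is (m, d). KD-tree nearest-neighbor queries make the
    cost O(n log n) in the sample size (linearithmic).
    """
    x, y = np.atleast_2d(x), np.atleast_2d(y)
    n, d = x.shape
    m = y.shape[0]
    # rho_i: distance from x_i to its nearest neighbor in x, excluding itself
    rho = cKDTree(x).query(x, k=2)[0][:, 1]
    # nu_i: distance from x_i to its nearest neighbor in y
    nu = cKDTree(y).query(x, k=1)[0]
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

def abc_rejection(observed, simulate, prior_sampler, n_samples=1000, eps=0.5):
    """ABC rejection sampling with the KL estimator as data discrepancy.

    Draws parameters from the prior, simulates synthetic data, and keeps
    only parameters whose estimated KL discrepancy to the observed data
    falls below the tolerance eps (a hypothetical threshold choice).
    """
    accepted = []
    for _ in range(n_samples):
        theta = prior_sampler()
        synthetic = simulate(theta)
        if knn_kl_divergence(observed, synthetic) < eps:
            accepted.append(theta)
    return np.array(accepted)
```

For instance, with observed data from N(1, 1), a uniform prior over the mean, and `simulate(theta)` drawing from N(theta, 1), the accepted parameters concentrate around 1 without any hand-crafted summary statistics.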
