Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy

Bai Jiang
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1711-1721, 2018.

Abstract

Complex simulator-based models usually have intractable likelihood functions, rendering likelihood-based inference methods inapplicable. Approximate Bayesian Computation (ABC) has emerged as an alternative framework for likelihood-free inference. It identifies a quasi-posterior distribution by finding parameter values that simulate synthetic data resembling the observed data. A major ingredient of ABC is the discrepancy measure between the observed and the simulated data, which conventionally requires constructing effective summary statistics, a fundamentally difficult task. To bypass this difficulty, we adopt a Kullback-Leibler divergence estimator to assess the data discrepancy. Our method enjoys asymptotic consistency and linearithmic time complexity as the data size increases. In experiments on five benchmark models, this method achieves quasi-posterior quality comparable to or higher than that of existing methods using other discrepancy measures.
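
As a concrete illustration of the approach summarized above, the sketch below implements rejection ABC with a nearest-neighbor Kullback-Leibler divergence estimator as the data discrepancy; it uses the standard 1-nearest-neighbor estimator of Wang, Kulkarni and Verdú, and KD-tree queries give the linearithmic cost mentioned in the abstract. The prior, simulator, acceptance quantile, and toy Gaussian example are illustrative assumptions, not taken from the paper, and the exact estimator and ABC variant used in the paper may differ.

import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y):
    """1-nearest-neighbor estimate of KL(p || q) from samples x ~ p and y ~ q.

    KD-tree construction and queries keep the cost O(n log n) in the
    sample size, matching the linearithmic complexity noted in the abstract.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    if y.ndim == 1:
        y = y[:, None]
    n, d = x.shape
    m = y.shape[0]
    # Distance from each x_i to its nearest neighbor in x, excluding itself
    # (k=2: the closest point in the tree is x_i itself, at distance 0).
    rho = cKDTree(x).query(x, k=2)[0][:, 1]
    # Distance from each x_i to its nearest neighbor in y.
    nu = cKDTree(y).query(x, k=1)[0]
    # Wang-Kulkarni-Verdu 1-NN estimator; assumes no duplicate points
    # (duplicates give zero distances and an infinite log-ratio).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))


def rejection_abc(y_obs, prior_sampler, simulator, n_draws=10_000, quantile=0.01):
    """Rejection ABC: keep the prior draws whose simulated data are closest
    to the observed data under the estimated KL divergence (quasi-posterior)."""
    thetas = np.array([prior_sampler() for _ in range(n_draws)])
    discrepancies = np.array(
        [knn_kl_divergence(y_obs, simulator(theta)) for theta in thetas]
    )
    cutoff = np.quantile(discrepancies, quantile)
    return thetas[discrepancies <= cutoff]


# Toy usage (hypothetical example, not from the paper): infer a Gaussian mean
# from 500 observations under a uniform prior.
rng = np.random.default_rng(0)
y_obs = rng.normal(loc=1.5, scale=1.0, size=500)
accepted = rejection_abc(
    y_obs,
    prior_sampler=lambda: rng.uniform(-5.0, 5.0),
    simulator=lambda theta: rng.normal(loc=theta, scale=1.0, size=500),
)
print(accepted.mean())  # quasi-posterior mean estimate, close to 1.5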

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-jiang18a,
  title     = {Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy},
  author    = {Jiang, Bai},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1711--1721},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/jiang18a/jiang18a.pdf},
  url       = {https://proceedings.mlr.press/v84/jiang18a.html},
  abstract  = {Complex simulator-based models usually have intractable likelihood functions, rendering the likelihood-based inference methods inapplicable. Approximate Bayesian Computation (ABC) emerges as an alternative framework of likelihood-free inference methods. It identifies a quasi-posterior distribution by finding values of parameter that simulate the synthetic data resembling the observed data. A major ingredient of ABC is the discrepancy measure between the observed and the simulated data, which conventionally involves a fundamental difficulty of constructing effective summary statistics. To bypass this difficulty, we adopt a Kullback-Leibler divergence estimator to assess the data discrepancy. Our method enjoys the asymptotic consistency and linearithmic time complexity as the data size increases. In experiments on five benchmark models, this method achieves a comparable or higher quasi-posterior quality, compared to the existing methods using other discrepancy measures.}
}
Endnote
%0 Conference Paper
%T Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy
%A Bai Jiang
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-jiang18a
%I PMLR
%P 1711--1721
%U https://proceedings.mlr.press/v84/jiang18a.html
%V 84
%X Complex simulator-based models usually have intractable likelihood functions, rendering the likelihood-based inference methods inapplicable. Approximate Bayesian Computation (ABC) emerges as an alternative framework of likelihood-free inference methods. It identifies a quasi-posterior distribution by finding values of parameter that simulate the synthetic data resembling the observed data. A major ingredient of ABC is the discrepancy measure between the observed and the simulated data, which conventionally involves a fundamental difficulty of constructing effective summary statistics. To bypass this difficulty, we adopt a Kullback-Leibler divergence estimator to assess the data discrepancy. Our method enjoys the asymptotic consistency and linearithmic time complexity as the data size increases. In experiments on five benchmark models, this method achieves a comparable or higher quasi-posterior quality, compared to the existing methods using other discrepancy measures.
APA
Jiang, B. (2018). Approximate Bayesian Computation with Kullback-Leibler Divergence as Data Discrepancy. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1711-1721. Available from https://proceedings.mlr.press/v84/jiang18a.html.