γ-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator

Masahiro Fujisawa, Takeshi Teshima, Issei Sato, Masashi Sugiyama
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:1783-1791, 2021.

Abstract

Approximate Bayesian computation (ABC) is a likelihood-free inference method that has been employed in various applications. However, ABC can be sensitive to outliers if a data discrepancy measure is chosen inappropriately. In this paper, we propose to use a nearest-neighbor-based γ-divergence estimator as a data discrepancy measure. We show that our estimator possesses a suitable robustness property called the redescending property. In addition, our estimator enjoys various desirable properties such as high flexibility, asymptotic unbiasedness, almost sure convergence, and linear time complexity. Through experiments, we demonstrate that our method achieves significantly higher robustness than existing discrepancy measures.
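The abstract describes plugging a nearest-neighbor divergence estimator into ABC as the data discrepancy. As a minimal illustrative sketch (not the paper's method), the snippet below runs plain rejection ABC with a k-nearest-neighbor KL-divergence estimator standing in for the paper's γ-divergence estimator; the two share the same nearest-neighbor estimation machinery, and all function names and parameters here are assumptions for illustration.

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """k-NN Kullback-Leibler divergence estimate D(p_x || p_y).

    Illustrative stand-in for the paper's gamma-divergence estimator:
    compares each point's k-th nearest-neighbor distance within its own
    sample to its k-th nearest-neighbor distance in the other sample.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n, m, d = len(x), len(y), x.shape[1]
    # k-th NN distance of each x_i within x (index k skips the self-distance 0)
    dxx = np.sort(np.linalg.norm(x[:, None] - x[None, :], axis=-1), axis=1)[:, k]
    # k-th NN distance of each x_i within y
    dxy = np.sort(np.linalg.norm(x[:, None] - y[None, :], axis=-1), axis=1)[:, k - 1]
    eps = 1e-12  # guard against log(0) for coincident points
    return d * np.mean(np.log((dxy + eps) / (dxx + eps))) + np.log(m / (n - 1))

def rejection_abc(observed, simulator, prior_sampler, eps, n_draws=1000, rng=None):
    """Plain rejection ABC: draw theta from the prior, simulate data, and
    keep theta when the discrepancy to the observation falls below eps."""
    rng = np.random.default_rng(rng)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler(rng)
        sim = simulator(theta, rng)
        if knn_kl_divergence(sim, observed) < eps:
            accepted.append(theta)
    return np.array(accepted)

# Toy usage: infer the mean of a Gaussian from 200 observations.
rng = np.random.default_rng(0)
observed = rng.normal(0.0, 1.0, size=200)
posterior = rejection_abc(
    observed,
    simulator=lambda th, r: r.normal(th, 1.0, size=200),
    prior_sampler=lambda r: r.uniform(-5.0, 5.0),
    eps=0.5,
    n_draws=300,
    rng=1,
)
```

The pairwise-distance computation here is quadratic per comparison for brevity; the paper's point is that a nearest-neighbor estimator can be made linear-time (e.g. via tree-based neighbor search) while retaining robustness to outliers.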

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-fujisawa21a,
  title     = {γ-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator},
  author    = {Fujisawa, Masahiro and Teshima, Takeshi and Sato, Issei and Sugiyama, Masashi},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {1783--1791},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/fujisawa21a/fujisawa21a.pdf},
  url       = {https://proceedings.mlr.press/v130/fujisawa21a.html},
  abstract  = {Approximate Bayesian computation (ABC) is a likelihood-free inference method that has been employed in various applications. However, ABC can be sensitive to outliers if a data discrepancy measure is chosen inappropriately. In this paper, we propose to use a nearest-neighbor-based γ-divergence estimator as a data discrepancy measure. We show that our estimator possesses a suitable robustness property called the redescending property. In addition, our estimator enjoys various desirable properties such as high flexibility, asymptotic unbiasedness, almost sure convergence, and linear time complexity. Through experiments, we demonstrate that our method achieves significantly higher robustness than existing discrepancy measures.}
}
Endnote
%0 Conference Paper
%T γ-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator
%A Masahiro Fujisawa
%A Takeshi Teshima
%A Issei Sato
%A Masashi Sugiyama
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-fujisawa21a
%I PMLR
%P 1783--1791
%U https://proceedings.mlr.press/v130/fujisawa21a.html
%V 130
%X Approximate Bayesian computation (ABC) is a likelihood-free inference method that has been employed in various applications. However, ABC can be sensitive to outliers if a data discrepancy measure is chosen inappropriately. In this paper, we propose to use a nearest-neighbor-based γ-divergence estimator as a data discrepancy measure. We show that our estimator possesses a suitable robustness property called the redescending property. In addition, our estimator enjoys various desirable properties such as high flexibility, asymptotic unbiasedness, almost sure convergence, and linear time complexity. Through experiments, we demonstrate that our method achieves significantly higher robustness than existing discrepancy measures.
APA
Fujisawa, M., Teshima, T., Sato, I. & Sugiyama, M. (2021). γ-ABC: Outlier-Robust Approximate Bayesian Computation Based on a Robust Divergence Estimator. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:1783-1791. Available from https://proceedings.mlr.press/v130/fujisawa21a.html.
