Coreset Markov chain Monte Carlo

Naitong Chen, Trevor Campbell
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:4438-4446, 2024.

Abstract

A Bayesian coreset is a small, weighted subset of data that replaces the full dataset during inference in order to reduce computational cost. However, state-of-the-art methods for tuning coreset weights are expensive, require nontrivial user input, and impose constraints on the model. In this work, we propose a new method, coreset MCMC, that simulates a Markov chain targeting the coreset posterior while simultaneously updating the coreset weights using those same draws. Coreset MCMC is simple to implement and tune, and can be used with any existing MCMC kernel. We analyze coreset MCMC in a representative setting to obtain key insights about the convergence behaviour of the method. Empirical results demonstrate that coreset MCMC provides higher-quality posterior approximations and reduced computational cost compared with other coreset construction methods. Further, compared with other general subsampling MCMC methods, coreset MCMC achieves higher sampling efficiency with competitively accurate posterior approximations.
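
To make the alternating scheme in the abstract concrete, below is a minimal, illustrative Python sketch: each iteration takes one MCMC step per chain targeting the current coreset posterior, then nudges the coreset weights with a stochastic gradient of KL(pi_w || pi) estimated from those same draws, using the identity grad_w KL = Cov[g(theta), w'g(theta) - L(theta)] for targets of the form pi_w proportional to exp(w'g) pi_0. The toy Gaussian model, the random-walk Metropolis kernel, the number of chains, the fixed learning rate, and the nonnegativity projection are all illustrative assumptions here, not the paper's exact configuration.

import numpy as np

rng = np.random.default_rng(0)

# Toy model (assumption): x_n ~ N(theta, 1) with a N(0, 1) prior on theta.
N = 1000
x = rng.normal(2.0, 1.0, size=N)

def loglik(theta, data):
    # Per-datum log-likelihoods ell_n(theta) up to constants; theta is (K,).
    return -0.5 * (data[None, :] - theta[:, None]) ** 2   # shape (K, n_data)

def log_prior(theta):
    return -0.5 * theta ** 2

# Coreset: m points drawn uniformly, weights initialized to N/m each.
m = 20
xc = x[rng.choice(N, size=m, replace=False)]
w = np.full(m, N / m)

def log_target(theta, w):
    # Log coreset posterior, up to a constant: log pi_0 + sum_i w_i ell_i.
    return log_prior(theta) + loglik(theta, xc) @ w

K = 8          # parallel chains used to estimate the KL gradient
step = 0.05    # random-walk proposal scale (assumption: untuned)
gamma = 1e-3   # weight learning rate (assumption: fixed, for simplicity)
theta = rng.normal(size=K)

for t in range(5000):
    # (1) One step of any MCMC kernel targeting the *current* coreset
    #     posterior; random-walk Metropolis is used here purely for brevity.
    prop = theta + step * rng.normal(size=K)
    accept = np.log(rng.uniform(size=K)) < log_target(prop, w) - log_target(theta, w)
    theta = np.where(accept, prop, theta)

    # (2) Update the weights with those same draws. Across the K chain
    #     states, grad_w KL(pi_w || pi) = Cov[g(theta), w'g(theta) - L(theta)],
    #     where g stacks the coreset log-likelihoods and L is the full-data one.
    g = loglik(theta, xc)                    # (K, m)
    L = loglik(theta, x).sum(axis=1)         # (K,)
    r = g @ w - L                            # (K,) residual potential
    grad = (g - g.mean(0)).T @ (r - r.mean()) / (K - 1)
    w = np.maximum(w - gamma * grad, 0.0)    # projected SGD step

print("weight sum:", w.sum(), "vs N =", N)
print("coreset posterior mean estimate:", theta.mean())

Note that this sketch still evaluates the full-data log-likelihood L when forming the gradient estimate; in practice that term would itself be subsampled, and the fixed learning rate would be replaced by an adaptive optimizer.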

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-chen24f,
  title     = {{C}oreset {M}arkov chain {M}onte {C}arlo},
  author    = {Chen, Naitong and Campbell, Trevor},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {4438--4446},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/chen24f/chen24f.pdf},
  url       = {https://proceedings.mlr.press/v238/chen24f.html}
}
Endnote
%0 Conference Paper
%T Coreset Markov chain Monte Carlo
%A Naitong Chen
%A Trevor Campbell
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-chen24f
%I PMLR
%P 4438--4446
%U https://proceedings.mlr.press/v238/chen24f.html
%V 238
APA
Chen, N. & Campbell, T. (2024). Coreset Markov chain Monte Carlo. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:4438-4446. Available from https://proceedings.mlr.press/v238/chen24f.html.