Bagging and the Bayesian Bootstrap

Merlise Clyde, Herbert Lee
Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, PMLR R3:57-62, 2001.

Abstract

Bagging is a method of obtaining more robust predictions when the model class under consideration is unstable with respect to the data, i.e., small changes in the data can cause the predicted values to change significantly. In this paper, we introduce a Bayesian version of bagging based on the Bayesian bootstrap. The Bayesian bootstrap resolves a theoretical problem with ordinary bagging and often results in more efficient estimators. We show how model averaging can be combined within the Bayesian bootstrap and illustrate the procedure with several examples.
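A rough illustration of the idea described in the abstract (not the authors' implementation): the Python sketch below replaces ordinary bagging's resample-with-replacement step by Dirichlet(1, ..., 1) weights on the observations (Rubin's Bayesian bootstrap) and averages predictions over many weighted refits of the base model. The function name bayesian_bagging_predict, the weighted-least-squares base learner, the n_draws parameter, and the synthetic data are all illustrative assumptions.

import numpy as np

def bayesian_bagging_predict(X, y, X_new, n_draws=200, rng=None):
    """Average predictions over Bayesian-bootstrap draws (illustrative sketch).

    Each draw assigns Dirichlet(1, ..., 1) weights to the n observations
    instead of resampling them with replacement, then refits the base model
    by weighted least squares and predicts at X_new.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    Xd = np.column_stack([np.ones(n), X])                  # design matrix with intercept
    Xn = np.column_stack([np.ones(X_new.shape[0]), X_new])
    preds = np.zeros((n_draws, X_new.shape[0]))
    for b in range(n_draws):
        w = rng.dirichlet(np.ones(n))                      # smooth weights summing to 1
        sw = np.sqrt(w)[:, None]                           # sqrt-weight rows for WLS
        beta, *_ = np.linalg.lstsq(sw * Xd, np.sqrt(w) * y, rcond=None)
        preds[b] = Xn @ beta
    return preds.mean(axis=0)                              # bagged (averaged) prediction

# Example usage on synthetic data
gen = np.random.default_rng(0)
X = gen.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + gen.normal(scale=0.3, size=100)
X_new = gen.normal(size=(5, 3))
print(bayesian_bagging_predict(X, y, X_new, rng=1))

Because the Dirichlet weights vary smoothly rather than taking integer resampling counts, every observation influences every fit, which is the source of the efficiency gain the abstract refers to.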

Cite this Paper


BibTeX
@InProceedings{pmlr-vR3-clyde01a,
  title     = {Bagging and the Bayesian Bootstrap},
  author    = {Clyde, Merlise and Lee, Herbert},
  booktitle = {Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics},
  pages     = {57--62},
  year      = {2001},
  editor    = {Richardson, Thomas S. and Jaakkola, Tommi S.},
  volume    = {R3},
  series    = {Proceedings of Machine Learning Research},
  month     = {04--07 Jan},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/r3/clyde01a/clyde01a.pdf},
  url       = {https://proceedings.mlr.press/r3/clyde01a.html},
  note      = {Reissued by PMLR on 31 March 2021.}
}
APA
Clyde, M. & Lee, H. (2001). Bagging and the Bayesian Bootstrap. Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research R3:57-62. Available from https://proceedings.mlr.press/r3/clyde01a.html. Reissued by PMLR on 31 March 2021.
