Centroid Approximation for Bootstrap: Improving Particle Quality at Inference

Mao Ye, Qiang Liu
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:25469-25489, 2022.

Abstract

Bootstrap is a principled and powerful frequentist statistical tool for uncertainty quantification. Unfortunately, standard bootstrap methods are computationally intensive due to the need to draw a large i.i.d. bootstrap sample to approximate the ideal bootstrap distribution; this largely hinders their application in large-scale machine learning, especially deep learning problems. In this work, we propose an efficient method to explicitly optimize a small set of high-quality “centroid” points to better approximate the ideal bootstrap distribution. We achieve this by minimizing a simple objective function that is asymptotically equivalent to the Wasserstein distance to the ideal bootstrap distribution. This allows us to provide an accurate estimation of uncertainty with a small number of bootstrap centroids, outperforming the naive i.i.d. sampling approach. Empirically, we show that our method can boost the performance of bootstrap in a variety of applications.
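
To make the contrast in the abstract concrete, the following is a minimal, purely illustrative Python sketch: it draws a standard i.i.d. bootstrap sample for a one-dimensional mean, then summarizes it with a small set of representative “centroid” points using a k-means-style step. This is not the paper’s algorithm — the proposed method optimizes the centroids directly, without first drawing a large bootstrap sample, via an objective asymptotically equivalent to the Wasserstein distance. All variable names, the toy estimation problem, and the update rule below are assumptions made only for illustration.

# Illustrative sketch (NOT the authors' implementation): i.i.d. bootstrap for a
# 1-D mean, then a small set of "centroid" particles summarizing the bootstrap
# distribution via a k-means-style assignment/update step.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=2.0, size=200)   # observed sample

# Standard i.i.d. bootstrap: many resampled estimates of the mean.
B = 5000
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean() for _ in range(B)
])

# Centroid-style summary: n << B representative points fitted to the replicates.
n = 10
centroids = rng.choice(boot_means, size=n, replace=False)   # initialization
for _ in range(50):
    # assign each bootstrap replicate to its nearest centroid
    assign = np.abs(boot_means[:, None] - centroids[None, :]).argmin(axis=1)
    # move each centroid to the mean of its assigned replicates
    for k in range(n):
        if np.any(assign == k):
            centroids[k] = boot_means[assign == k].mean()

print("bootstrap std of the mean estimate:", boot_means.std())
print("spread of the centroid summary (rough):", centroids.std())

In this toy setting the ten centroids give a compact summary of the bootstrap distribution’s spread; the paper’s contribution is obtaining such high-quality representative points by direct optimization, avoiding the large i.i.d. bootstrap sample entirely.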

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-ye22a,
  title     = {Centroid Approximation for Bootstrap: Improving Particle Quality at Inference},
  author    = {Ye, Mao and Liu, Qiang},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {25469--25489},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/ye22a/ye22a.pdf},
  url       = {https://proceedings.mlr.press/v162/ye22a.html},
  abstract  = {Bootstrap is a principled and powerful frequentist statistical tool for uncertainty quantification. Unfortunately, standard bootstrap methods are computationally intensive due to the need of drawing a large i.i.d. bootstrap sample to approximate the ideal bootstrap distribution; this largely hinders their application in large-scale machine learning, especially deep learning problems. In this work, we propose an efficient method to explicitly optimize a small set of high quality “centroid” points to better approximate the ideal bootstrap distribution. We achieve this by minimizing a simple objective function that is asymptotically equivalent to the Wasserstein distance to the ideal bootstrap distribution. This allows us to provide an accurate estimation of uncertainty with a small number of bootstrap centroids, outperforming the naive i.i.d. sampling approach. Empirically, we show that our method can boost the performance of bootstrap in a variety of applications.}
}
Endnote
%0 Conference Paper
%T Centroid Approximation for Bootstrap: Improving Particle Quality at Inference
%A Mao Ye
%A Qiang Liu
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-ye22a
%I PMLR
%P 25469--25489
%U https://proceedings.mlr.press/v162/ye22a.html
%V 162
%X Bootstrap is a principled and powerful frequentist statistical tool for uncertainty quantification. Unfortunately, standard bootstrap methods are computationally intensive due to the need of drawing a large i.i.d. bootstrap sample to approximate the ideal bootstrap distribution; this largely hinders their application in large-scale machine learning, especially deep learning problems. In this work, we propose an efficient method to explicitly optimize a small set of high quality “centroid” points to better approximate the ideal bootstrap distribution. We achieve this by minimizing a simple objective function that is asymptotically equivalent to the Wasserstein distance to the ideal bootstrap distribution. This allows us to provide an accurate estimation of uncertainty with a small number of bootstrap centroids, outperforming the naive i.i.d. sampling approach. Empirically, we show that our method can boost the performance of bootstrap in a variety of applications.
APA
Ye, M. & Liu, Q. (2022). Centroid Approximation for Bootstrap: Improving Particle Quality at Inference. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:25469-25489. Available from https://proceedings.mlr.press/v162/ye22a.html.
