Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty

Kaizhao Liu, Jose Blanchet, Lexing Ying, Yiping Lu
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:30669-30701, 2024.

Abstract

Bootstrap is a popular methodology for simulating input uncertainty. However, it can be computationally expensive when the number of samples is large. We propose a new approach called Orthogonal Bootstrap that reduces the number of required Monte Carlo replications. We decompose the target being simulated into two parts: the non-orthogonal part, which has a closed-form result known as the Infinitesimal Jackknife, and the orthogonal part, which is easier to simulate. We theoretically and numerically show that Orthogonal Bootstrap significantly reduces the computational cost of Bootstrap while improving empirical accuracy and maintaining the same width of the constructed interval.
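
One way to read the decomposition is as a control variate: each bootstrap replicate is split into its first-order (linear) part, whose behavior under resampling the Infinitesimal Jackknife captures in closed form, and a low-variance orthogonal residual, which is the only piece that still needs Monte Carlo. The sketch below illustrates this idea for bootstrap bias estimation of a smooth plug-in statistic; the statistic T, the sample, and all variable names are illustrative assumptions of ours, not code from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative smooth statistic T(P) = exp(E_P[X]),
    # estimated by the plug-in T(P_n) = exp(xbar).
    def T(sample):
        return np.exp(sample.mean())

    n = 200
    x = rng.normal(size=n)
    xbar = x.mean()
    theta_hat = T(x)

    # First-order influence function of T at P_n: psi(x_i) = T'(xbar) * (x_i - xbar).
    # Its resampling mean is zero and its variance has the closed-form
    # Infinitesimal Jackknife expression, so it never has to be simulated.
    psi = np.exp(xbar) * (x - xbar)

    B = 50  # far fewer replications than a plain bootstrap would need
    plain, orthogonal = [], []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)       # one bootstrap resample
        t_star = T(x[idx])
        linear = psi[idx].mean()               # non-orthogonal (linear) part
        plain.append(t_star - theta_hat)
        orthogonal.append(t_star - theta_hat - linear)  # orthogonal residual

    # Both averages estimate the bootstrap bias E*[T(P_n*)] - T(P_n),
    # but the orthogonal one has much smaller Monte Carlo variance.
    print("plain bootstrap bias estimate:     ", np.mean(plain))
    print("orthogonal bootstrap bias estimate:", np.mean(orthogonal))

Because the linear part is handled analytically, the same budget of B replications yields a far more accurate estimate from the orthogonal residuals, which is the source of the computational savings described in the abstract.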

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-liu24c,
  title     = {Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty},
  author    = {Liu, Kaizhao and Blanchet, Jose and Ying, Lexing and Lu, Yiping},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {30669--30701},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/liu24c/liu24c.pdf},
  url       = {https://proceedings.mlr.press/v235/liu24c.html},
  abstract  = {Bootstrap is a popular methodology for simulating input uncertainty. However, it can be computationally expensive when the number of samples is large. We propose a new approach called Orthogonal Bootstrap that reduces the number of required Monte Carlo replications. We decompose the target being simulated into two parts: the non-orthogonal part, which has a closed-form result known as the Infinitesimal Jackknife, and the orthogonal part, which is easier to simulate. We theoretically and numerically show that Orthogonal Bootstrap significantly reduces the computational cost of Bootstrap while improving empirical accuracy and maintaining the same width of the constructed interval.}
}
Endnote
%0 Conference Paper
%T Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty
%A Kaizhao Liu
%A Jose Blanchet
%A Lexing Ying
%A Yiping Lu
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-liu24c
%I PMLR
%P 30669--30701
%U https://proceedings.mlr.press/v235/liu24c.html
%V 235
%X Bootstrap is a popular methodology for simulating input uncertainty. However, it can be computationally expensive when the number of samples is large. We propose a new approach called Orthogonal Bootstrap that reduces the number of required Monte Carlo replications. We decompose the target being simulated into two parts: the non-orthogonal part, which has a closed-form result known as the Infinitesimal Jackknife, and the orthogonal part, which is easier to simulate. We theoretically and numerically show that Orthogonal Bootstrap significantly reduces the computational cost of Bootstrap while improving empirical accuracy and maintaining the same width of the constructed interval.
APA
Liu, K., Blanchet, J., Ying, L., & Lu, Y. (2024). Orthogonal Bootstrap: Efficient Simulation of Input Uncertainty. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:30669-30701. Available from https://proceedings.mlr.press/v235/liu24c.html.