Nested Expectations with Kernel Quadrature

Zonghao Chen, Masha Naslidnyk, Francois-Xavier Briol
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:8760-8793, 2025.

Abstract

This paper considers the challenging computational task of estimating nested expectations. Existing algorithms, such as nested Monte Carlo or multilevel Monte Carlo, are known to be consistent but require a large number of samples at both inner and outer levels to converge. Instead, we propose a novel estimator consisting of nested kernel quadrature estimators and we prove that it has a faster convergence rate than all baseline methods when the integrands have sufficient smoothness. We then demonstrate empirically that our proposed method does indeed require the fewest number of samples to estimate nested expectations over a range of real-world application areas from Bayesian optimisation to option pricing and health economics.
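For readers unfamiliar with the setup, the quantity being estimated has the form E_X[ f( E_{Y|X}[ g(X, Y) ] ) ]. The sketch below is a minimal illustration of the *nested Monte Carlo baseline* the abstract contrasts against, not the paper's kernel quadrature estimator; all function names and the toy distributions are illustrative assumptions.

```python
import numpy as np

def nested_monte_carlo(f, g, sample_x, sample_y_given_x, n_outer, n_inner, rng):
    """Plain nested Monte Carlo estimate of E_X[ f( E_{Y|X}[ g(X, Y) ] ) ]."""
    xs = sample_x(n_outer, rng)  # outer samples X_1, ..., X_N
    inner_means = np.array([
        g(x, sample_y_given_x(x, n_inner, rng)).mean()  # inner estimate of E[g(x, Y)]
        for x in xs
    ])
    return f(inner_means).mean()  # outer average of f applied to inner estimates

# Toy check: X, Y ~ N(0, 1) independent, g(x, y) = x + y, f(t) = t**2,
# so E_Y[g(X, Y)] = X and the nested expectation is E[X^2] = 1.
rng = np.random.default_rng(0)
est = nested_monte_carlo(
    f=lambda t: t**2,
    g=lambda x, y: x + y,
    sample_x=lambda n, r: r.standard_normal(n),
    sample_y_given_x=lambda x, n, r: r.standard_normal(n),
    n_outer=2000, n_inner=200, rng=rng,
)
```

Because the outer function f is applied to a noisy inner estimate, samples are needed at both levels for the bias and variance to vanish, which is the sample-inefficiency the paper's nested kernel quadrature estimator is designed to improve on for smooth integrands.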

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-chen25av,
  title     = {Nested Expectations with Kernel Quadrature},
  author    = {Chen, Zonghao and Naslidnyk, Masha and Briol, Francois-Xavier},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {8760--8793},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/chen25av/chen25av.pdf},
  url       = {https://proceedings.mlr.press/v267/chen25av.html},
  abstract  = {This paper considers the challenging computational task of estimating nested expectations. Existing algorithms, such as nested Monte Carlo or multilevel Monte Carlo, are known to be consistent but require a large number of samples at both inner and outer levels to converge. Instead, we propose a novel estimator consisting of nested kernel quadrature estimators and we prove that it has a faster convergence rate than all baseline methods when the integrands have sufficient smoothness. We then demonstrate empirically that our proposed method does indeed require the fewest number of samples to estimate nested expectations over a range of real-world application areas from Bayesian optimisation to option pricing and health economics.}
}
Endnote
%0 Conference Paper
%T Nested Expectations with Kernel Quadrature
%A Zonghao Chen
%A Masha Naslidnyk
%A Francois-Xavier Briol
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-chen25av
%I PMLR
%P 8760--8793
%U https://proceedings.mlr.press/v267/chen25av.html
%V 267
%X This paper considers the challenging computational task of estimating nested expectations. Existing algorithms, such as nested Monte Carlo or multilevel Monte Carlo, are known to be consistent but require a large number of samples at both inner and outer levels to converge. Instead, we propose a novel estimator consisting of nested kernel quadrature estimators and we prove that it has a faster convergence rate than all baseline methods when the integrands have sufficient smoothness. We then demonstrate empirically that our proposed method does indeed require the fewest number of samples to estimate nested expectations over a range of real-world application areas from Bayesian optimisation to option pricing and health economics.
APA
Chen, Z., Naslidnyk, M. & Briol, F.-X. (2025). Nested Expectations with Kernel Quadrature. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:8760-8793. Available from https://proceedings.mlr.press/v267/chen25av.html.