Natural Evolutionary Search meets Probabilistic Numerics

Pierre Osselin, Masaki Adachi, Xiaowen Dong, Michael A Osborne
Proceedings of the First International Conference on Probabilistic Numerics, PMLR 271:50-74, 2025.

Abstract

Zeroth-order local optimisation algorithms are essential for solving real-valued black-box optimisation problems. Among these, Natural Evolution Strategies (NES) represent a prominent class, particularly well-suited for scenarios where prior distributions are available. By optimising the objective function in the space of search distributions, NES algorithms naturally integrate prior knowledge during initialisation, making them effective in settings such as semi-supervised learning and user-prior belief frameworks. However, due to their reliance on random sampling and Monte Carlo estimates, NES algorithms can suffer from limited sample efficiency. In this paper, we introduce a novel class of algorithms, termed Probabilistic Natural Evolutionary Strategy Algorithms (ProbNES), which enhance the NES framework with Bayesian quadrature. We show that ProbNES algorithms consistently outperform their non-probabilistic counterparts as well as global sample-efficient methods such as Bayesian Optimisation (BO) or $\pi$BO across a wide range of tasks, including benchmark test functions, data-driven optimisation tasks, user-informed hyperparameter tuning tasks and locomotion tasks.
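For context, the Monte Carlo search-gradient step that the abstract refers to can be illustrated with a minimal separable-NES sketch on a toy objective. This is an illustrative sketch only, not the paper's ProbNES method: the isotropic Gaussian search distribution, population size, learning rates, and rank-based fitness shaping are generic choices, and the expectation that Bayesian quadrature would replace is estimated here by plain Monte Carlo averaging.

```python
import numpy as np

def sphere(x):
    """Toy black-box objective to minimise (not from the paper)."""
    return np.sum(x ** 2)

def nes_minimise(f, mu, sigma=1.0, n_pop=20, lr_mu=0.1,
                 lr_sigma=0.05, iters=200, seed=0):
    """Minimal separable NES sketch with an isotropic Gaussian
    search distribution N(mu, sigma^2 I). The natural gradient of the
    expected objective w.r.t. (mu, log sigma) is estimated by Monte
    Carlo over a sampled population -- the estimate ProbNES would
    instead obtain via Bayesian quadrature."""
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        eps = rng.standard_normal((n_pop, mu.size))      # perturbations
        fitness = np.array([f(mu + sigma * e) for e in eps])
        # Rank-based fitness shaping: best sample gets utility +0.5,
        # worst gets -0.5, making updates invariant to f's scale.
        ranks = np.argsort(np.argsort(fitness))
        utils = (n_pop - 1 - ranks) / (n_pop - 1) - 0.5
        # Monte Carlo natural-gradient steps for mean and step size.
        mu = mu + lr_mu * sigma * (utils @ eps) / n_pop
        grad_log_sigma = np.mean(utils * (eps ** 2 - 1).sum(axis=1)) / mu.size
        sigma = sigma * np.exp(lr_sigma * grad_log_sigma)
    return mu

x_star = nes_minimise(sphere, mu=np.array([3.0, -2.0]))
print(x_star)
```

Because each iteration averages only `n_pop` noisy evaluations, the gradient estimate has high variance; this is the sample-inefficiency the abstract attributes to standard NES.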

Cite this Paper


BibTeX
@InProceedings{pmlr-v271-osselin25a,
  title     = {Natural Evolutionary Search meets Probabilistic Numerics},
  author    = {Osselin, Pierre and Adachi, Masaki and Dong, Xiaowen and Osborne, Michael A},
  booktitle = {Proceedings of the First International Conference on Probabilistic Numerics},
  pages     = {50--74},
  year      = {2025},
  editor    = {Kanagawa, Motonobu and Cockayne, Jon and Gessner, Alexandra and Hennig, Philipp},
  volume    = {271},
  series    = {Proceedings of Machine Learning Research},
  month     = {01--03 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v271/main/assets/osselin25a/osselin25a.pdf},
  url       = {https://proceedings.mlr.press/v271/osselin25a.html},
  abstract  = {Zeroth-order local optimisation algorithms are essential for solving real-valued black-box optimisation problems. Among these, Natural Evolution Strategies (NES) represent a prominent class, particularly well-suited for scenarios where prior distributions are available. By optimising the objective function in the space of search distributions, NES algorithms naturally integrate prior knowledge during initialisation, making them effective in settings such as semi-supervised learning and user-prior belief frameworks. However, due to their reliance on random sampling and Monte Carlo estimates, NES algorithms can suffer from limited sample efficiency. In this paper, we introduce a novel class of algorithms, termed Probabilistic Natural Evolutionary Strategy Algorithms (ProbNES), which enhance the NES framework with Bayesian quadrature. We show that ProbNES algorithms consistently outperform their non-probabilistic counterparts as well as global sample-efficient methods such as Bayesian Optimisation (BO) or $\pi$BO across a wide range of tasks, including benchmark test functions, data-driven optimisation tasks, user-informed hyperparameter tuning tasks and locomotion tasks.}
}
Endnote
%0 Conference Paper
%T Natural Evolutionary Search meets Probabilistic Numerics
%A Pierre Osselin
%A Masaki Adachi
%A Xiaowen Dong
%A Michael A Osborne
%B Proceedings of the First International Conference on Probabilistic Numerics
%C Proceedings of Machine Learning Research
%D 2025
%E Motonobu Kanagawa
%E Jon Cockayne
%E Alexandra Gessner
%E Philipp Hennig
%F pmlr-v271-osselin25a
%I PMLR
%P 50--74
%U https://proceedings.mlr.press/v271/osselin25a.html
%V 271
%X Zeroth-order local optimisation algorithms are essential for solving real-valued black-box optimisation problems. Among these, Natural Evolution Strategies (NES) represent a prominent class, particularly well-suited for scenarios where prior distributions are available. By optimising the objective function in the space of search distributions, NES algorithms naturally integrate prior knowledge during initialisation, making them effective in settings such as semi-supervised learning and user-prior belief frameworks. However, due to their reliance on random sampling and Monte Carlo estimates, NES algorithms can suffer from limited sample efficiency. In this paper, we introduce a novel class of algorithms, termed Probabilistic Natural Evolutionary Strategy Algorithms (ProbNES), which enhance the NES framework with Bayesian quadrature. We show that ProbNES algorithms consistently outperform their non-probabilistic counterparts as well as global sample-efficient methods such as Bayesian Optimisation (BO) or $\pi$BO across a wide range of tasks, including benchmark test functions, data-driven optimisation tasks, user-informed hyperparameter tuning tasks and locomotion tasks.
APA
Osselin, P., Adachi, M., Dong, X. &amp; Osborne, M.A. (2025). Natural Evolutionary Search meets Probabilistic Numerics. Proceedings of the First International Conference on Probabilistic Numerics, in Proceedings of Machine Learning Research 271:50-74. Available from https://proceedings.mlr.press/v271/osselin25a.html.