Practical Nonisotropic Monte Carlo Sampling in High Dimensions via Determinantal Point Processes

Krzysztof Choromanski, Aldo Pacchiano, Jack Parker-Holder, Yunhao Tang
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:1363-1374, 2020.

Abstract

We propose a new class of practical structured methods for nonisotropic Monte Carlo (MC) sampling, called DPPMC, designed for high-dimensional nonisotropic distributions, where samples are correlated via determinantal point processes to reduce the variance of the estimator. We successfully apply DPPMCs to high-dimensional problems involving nonisotropic distributions arising in guided evolution strategy (GES) methods for reinforcement learning (RL), CMA-ES techniques and trust region algorithms for blackbox optimization, improving the state of the art in all these settings. In particular, we show that DPPMCs drastically improve exploration profiles of the existing evolution strategy algorithms. We further confirm our results by analyzing random feature map estimators for Gaussian mixture kernels. We provide theoretical justification of our empirical results, showing a connection between DPPMCs and recently introduced structured orthogonal MC methods for isotropic distributions.
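Since the abstract only gestures at the mechanism, the following is a minimal, heavily simplified sketch of the underlying idea: draw an over-complete pool of candidates from a nonisotropic Gaussian and keep a diverse subset by greedily maximizing the determinant of a similarity-kernel submatrix, a crude stand-in for exact determinantal point process sampling. Everything here (function names, the RBF kernel, the greedy selection, the plain subset average) is an illustrative assumption and not the paper's DPPMC estimator, which is constructed and weighted differently; in particular, a naive average over a diversified subset can be biased.

# Illustrative sketch only -- not the paper's DPPMC algorithm.
import numpy as np

def candidate_pool(mean, cov, pool_size, rng):
    """Draw i.i.d. candidates from the target nonisotropic Gaussian."""
    return rng.multivariate_normal(mean, cov, size=pool_size)

def greedy_diverse_subset(points, k, lengthscale=1.0):
    """Greedily pick k points maximizing log-det of an RBF kernel submatrix
    (a simple MAP-style surrogate for sampling from a DPP)."""
    sq_dists = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    L = np.exp(-sq_dists / (2.0 * lengthscale ** 2))
    selected = []
    for _ in range(k):
        best_i, best_logdet = None, -np.inf
        for i in range(len(points)):
            if i in selected:
                continue
            idx = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if sign > 0 and logdet > best_logdet:
                best_i, best_logdet = i, logdet
        selected.append(best_i)
    return points[selected]

def mc_estimate(f, samples):
    """Plain Monte Carlo average of f over the chosen samples."""
    return np.mean([f(x) for x in samples])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 10
    mean = np.zeros(d)
    A = rng.standard_normal((d, d))
    cov = A @ A.T + 0.1 * np.eye(d)   # nonisotropic covariance
    pool = candidate_pool(mean, cov, pool_size=200, rng=rng)
    samples = greedy_diverse_subset(pool, k=20)
    print("diverse-sample MC estimate:", mc_estimate(lambda x: np.sum(np.cos(x)), samples))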

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-choromanski20a,
  title     = {Practical Nonisotropic Monte Carlo Sampling in High Dimensions via Determinantal Point Processes},
  author    = {Choromanski, Krzysztof and Pacchiano, Aldo and Parker-Holder, Jack and Tang, Yunhao},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {1363--1374},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/choromanski20a/choromanski20a.pdf},
  url       = {https://proceedings.mlr.press/v108/choromanski20a.html},
  abstract  = {We propose a new class of practical structured methods for nonisotropic Monte Carlo (MC) sampling, called DPPMC, designed for high-dimensional nonisotropic distributions where samples are correlated to reduce the variance of the estimator via determinantal point processes. We successfully apply DPPMCs to high-dimensional problems involving nonisotropic distributions arising in guided evolution strategy (GES) methods for reinforcement learning (RL), CMA-ES techniques and trust region algorithms for blackbox optimization, improving state-of-the-art in all these settings. In particular, we show that DPPMCs drastically improve exploration profiles of the existing evolution strategy algorithms. We further confirm our results, analyzing random feature map estimators for Gaussian mixture kernels. We provide theoretical justification of our empirical results, showing a connection between DPPMCs and recently introduced structured orthogonal MC methods for isotropic distributions.}
}
Endnote
%0 Conference Paper
%T Practical Nonisotropic Monte Carlo Sampling in High Dimensions via Determinantal Point Processes
%A Krzysztof Choromanski
%A Aldo Pacchiano
%A Jack Parker-Holder
%A Yunhao Tang
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-choromanski20a
%I PMLR
%P 1363--1374
%U https://proceedings.mlr.press/v108/choromanski20a.html
%V 108
%X We propose a new class of practical structured methods for nonisotropic Monte Carlo (MC) sampling, called DPPMC, designed for high-dimensional nonisotropic distributions where samples are correlated to reduce the variance of the estimator via determinantal point processes. We successfully apply DPPMCs to high-dimensional problems involving nonisotropic distributions arising in guided evolution strategy (GES) methods for reinforcement learning (RL), CMA-ES techniques and trust region algorithms for blackbox optimization, improving state-of-the-art in all these settings. In particular, we show that DPPMCs drastically improve exploration profiles of the existing evolution strategy algorithms. We further confirm our results, analyzing random feature map estimators for Gaussian mixture kernels. We provide theoretical justification of our empirical results, showing a connection between DPPMCs and recently introduced structured orthogonal MC methods for isotropic distributions.
APA
Choromanski, K., Pacchiano, A., Parker-Holder, J. & Tang, Y. (2020). Practical Nonisotropic Monte Carlo Sampling in High Dimensions via Determinantal Point Processes. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:1363-1374. Available from https://proceedings.mlr.press/v108/choromanski20a.html.