Guided evolutionary strategies: augmenting random search with surrogate gradients

Niru Maheswaranathan, Luke Metz, George Tucker, Dami Choi, Jascha Sohl-Dickstein
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:4264-4273, 2019.

Abstract

Many applications in machine learning require optimizing a function whose true gradient is unknown or computationally expensive, but where surrogate gradient information (directions that may be correlated with the true gradient) is cheaply available. For example, this occurs when an approximate gradient is easier to compute than the full gradient (e.g. in meta-learning or unrolled optimization), or when a true gradient is intractable and is replaced with a surrogate (e.g. in reinforcement learning or training networks with discrete variables). We propose Guided Evolutionary Strategies (GES), a method for optimally using surrogate gradient directions to accelerate random search. GES defines a search distribution for evolutionary strategies that is elongated along a subspace spanned by the surrogate gradients and estimates a descent direction which can then be passed to a first-order optimizer. We analytically and numerically characterize the tradeoffs that result from tuning how strongly the search distribution is stretched along the guiding subspace and use this to derive a setting of the hyperparameters that works well across problems. We evaluate GES on several example problems, demonstrating an improvement over both standard evolutionary strategies and first-order methods that directly follow the surrogate gradient.
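
For a concrete picture of the method described above, the following is a minimal NumPy sketch of one GES update, written from the abstract's description: sample perturbations from a Gaussian elongated along the subspace spanned by the surrogate gradients, form an antithetic finite-difference descent estimate, and hand it to a first-order optimizer. The function name guided_es_step and the hyperparameter defaults (sigma, alpha, beta, num_pairs, lr) are illustrative assumptions, not the authors' reference implementation.

    import numpy as np

    def guided_es_step(f, x, surrogate_grads, sigma=0.1, alpha=0.5, beta=2.0,
                       num_pairs=10, lr=0.01, rng=None):
        """One Guided ES update (sketch).

        f               : scalar objective, f(x) -> float
        x               : current parameters, shape (n,)
        surrogate_grads : k surrogate gradient directions, shape (n, k)
        """
        if rng is None:
            rng = np.random.default_rng()
        n = x.shape[0]
        k = surrogate_grads.shape[1]

        # Orthonormal basis U for the guiding subspace spanned by the surrogates.
        U, _ = np.linalg.qr(surrogate_grads)

        grad_est = np.zeros(n)
        for _ in range(num_pairs):
            # Sample from the elongated search distribution
            #   eps ~ N(0, sigma^2 * (alpha/n * I + (1 - alpha)/k * U U^T))
            # by mixing full-space noise with noise confined to the subspace.
            eps = sigma * (np.sqrt(alpha / n) * rng.standard_normal(n)
                           + np.sqrt((1 - alpha) / k) * U @ rng.standard_normal(k))
            # Antithetic (mirrored) function evaluations.
            grad_est += eps * (f(x + eps) - f(x - eps))
        grad_est *= beta / (2 * sigma**2 * num_pairs)

        # The descent estimate can be passed to any first-order optimizer;
        # plain gradient descent is used here for simplicity.
        return x - lr * grad_est

In this sketch, surrogate_grads could hold, for instance, a single biased gradient estimate (k = 1). Setting alpha = 1 recovers standard isotropic evolutionary strategies, while alpha near 0 restricts the search to the guiding subspace; the tradeoff between these extremes is what the paper characterizes analytically.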

Cite this Paper

BibTeX
@InProceedings{pmlr-v97-maheswaranathan19a,
  title     = {Guided evolutionary strategies: augmenting random search with surrogate gradients},
  author    = {Maheswaranathan, Niru and Metz, Luke and Tucker, George and Choi, Dami and Sohl-Dickstein, Jascha},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {4264--4273},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/maheswaranathan19a/maheswaranathan19a.pdf},
  url       = {https://proceedings.mlr.press/v97/maheswaranathan19a.html},
  abstract  = {Many applications in machine learning require optimizing a function whose true gradient is unknown or computationally expensive, but where surrogate gradient information (directions that may be correlated with the true gradient) is cheaply available. For example, this occurs when an approximate gradient is easier to compute than the full gradient (e.g. in meta-learning or unrolled optimization), or when a true gradient is intractable and is replaced with a surrogate (e.g. in reinforcement learning or training networks with discrete variables). We propose Guided Evolutionary Strategies (GES), a method for optimally using surrogate gradient directions to accelerate random search. GES defines a search distribution for evolutionary strategies that is elongated along a subspace spanned by the surrogate gradients and estimates a descent direction which can then be passed to a first-order optimizer. We analytically and numerically characterize the tradeoffs that result from tuning how strongly the search distribution is stretched along the guiding subspace and use this to derive a setting of the hyperparameters that works well across problems. We evaluate GES on several example problems, demonstrating an improvement over both standard evolutionary strategies and first-order methods that directly follow the surrogate gradient.}
}
Endnote
%0 Conference Paper
%T Guided evolutionary strategies: augmenting random search with surrogate gradients
%A Niru Maheswaranathan
%A Luke Metz
%A George Tucker
%A Dami Choi
%A Jascha Sohl-Dickstein
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-maheswaranathan19a
%I PMLR
%P 4264--4273
%U https://proceedings.mlr.press/v97/maheswaranathan19a.html
%V 97
%X Many applications in machine learning require optimizing a function whose true gradient is unknown or computationally expensive, but where surrogate gradient information (directions that may be correlated with the true gradient) is cheaply available. For example, this occurs when an approximate gradient is easier to compute than the full gradient (e.g. in meta-learning or unrolled optimization), or when a true gradient is intractable and is replaced with a surrogate (e.g. in reinforcement learning or training networks with discrete variables). We propose Guided Evolutionary Strategies (GES), a method for optimally using surrogate gradient directions to accelerate random search. GES defines a search distribution for evolutionary strategies that is elongated along a subspace spanned by the surrogate gradients and estimates a descent direction which can then be passed to a first-order optimizer. We analytically and numerically characterize the tradeoffs that result from tuning how strongly the search distribution is stretched along the guiding subspace and use this to derive a setting of the hyperparameters that works well across problems. We evaluate GES on several example problems, demonstrating an improvement over both standard evolutionary strategies and first-order methods that directly follow the surrogate gradient.
APA
Maheswaranathan, N., Metz, L., Tucker, G., Choi, D. & Sohl-Dickstein, J. (2019). Guided evolutionary strategies: augmenting random search with surrogate gradients. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:4264-4273. Available from https://proceedings.mlr.press/v97/maheswaranathan19a.html.