Local Saddle Point Optimization: A Curvature Exploitation Approach

Leonard Adolphs, Hadi Daneshmand, Aurelien Lucchi, Thomas Hofmann
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:486-495, 2019.

Abstract

Gradient-based optimization methods are the most popular choice for finding local optima in classical minimization and saddle point problems. Here, we highlight a systemic issue of gradient dynamics that arises for saddle point problems, namely the presence of undesired stable stationary points that are not local optima. We propose a novel optimization approach that exploits curvature information in order to escape from these undesired stationary points. We prove that different optimization methods, including the gradient method and Adagrad, equipped with curvature exploitation can escape non-optimal stationary points. We also provide empirical results on common saddle point problems which confirm the advantage of using curvature exploitation.
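The phenomenon the abstract describes can be seen on a small quadratic min-max problem. The sketch below is a hypothetical illustration (the specific function and step sizes are our own, not taken from the paper): simultaneous gradient descent-ascent converges to the origin, yet the Hessian block of the minimization variable has negative curvature there, so the origin is not a local min-max; a step along the negative-curvature direction then decreases the objective in x.

```python
# Hypothetical toy instance of min_x max_y f(x, y), constructed for
# illustration (not an example from the paper):
#   f(x, y) = -x^2/2 + 2xy - 3y^2/2
# The origin is a stationary point that simultaneous gradient
# descent-ascent (GDA) is attracted to, yet d2f/dx2 = -1 < 0 there,
# so (0, 0) is NOT a local minimum in x: an "undesired" stable point.

def f(x, y):
    return -x**2 / 2 + 2 * x * y - 3 * y**2 / 2

def grad(x, y):
    return -x + 2 * y, 2 * x - 3 * y  # (df/dx, df/dy)

eta = 0.1
x, y = 1.0, 1.0
for _ in range(500):
    gx, gy = grad(x, y)
    x, y = x - eta * gx, y + eta * gy  # descend in x, ascend in y

assert abs(x) < 1e-6 and abs(y) < 1e-6  # GDA converged to the origin

# Curvature exploitation: inspect the Hessian block of the min-variable.
hxx = -1.0          # d2f/dx2, constant for this quadratic problem
if hxx < 0:
    # Negative curvature in x: the stationary point is not a local
    # min-max. Step along the corresponding eigenvector (here, the
    # x-axis) to escape it.
    x += eta

print(f(x, y) < f(0.0, 0.0))  # True: the curvature step decreases f in x
```

The key design point is that the escape test uses second-order information only of the block being minimized (and, symmetrically, maximized), which is exactly the information plain gradient dynamics ignore.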

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-adolphs19a,
  title     = {Local Saddle Point Optimization: A Curvature Exploitation Approach},
  author    = {Adolphs, Leonard and Daneshmand, Hadi and Lucchi, Aurelien and Hofmann, Thomas},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {486--495},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/adolphs19a/adolphs19a.pdf},
  url       = {https://proceedings.mlr.press/v89/adolphs19a.html},
  abstract  = {Gradient-based optimization methods are the most popular choice for finding local optima in classical minimization and saddle point problems. Here, we highlight a systemic issue of gradient dynamics that arises for saddle point problems, namely the presence of undesired stable stationary points that are not local optima. We propose a novel optimization approach that exploits curvature information in order to escape from these undesired stationary points. We prove that different optimization methods, including the gradient method and Adagrad, equipped with curvature exploitation can escape non-optimal stationary points. We also provide empirical results on common saddle point problems which confirm the advantage of using curvature exploitation.}
}
Endnote
%0 Conference Paper
%T Local Saddle Point Optimization: A Curvature Exploitation Approach
%A Leonard Adolphs
%A Hadi Daneshmand
%A Aurelien Lucchi
%A Thomas Hofmann
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-adolphs19a
%I PMLR
%P 486--495
%U https://proceedings.mlr.press/v89/adolphs19a.html
%V 89
%X Gradient-based optimization methods are the most popular choice for finding local optima in classical minimization and saddle point problems. Here, we highlight a systemic issue of gradient dynamics that arises for saddle point problems, namely the presence of undesired stable stationary points that are not local optima. We propose a novel optimization approach that exploits curvature information in order to escape from these undesired stationary points. We prove that different optimization methods, including the gradient method and Adagrad, equipped with curvature exploitation can escape non-optimal stationary points. We also provide empirical results on common saddle point problems which confirm the advantage of using curvature exploitation.
APA
Adolphs, L., Daneshmand, H., Lucchi, A. & Hofmann, T. (2019). Local Saddle Point Optimization: A Curvature Exploitation Approach. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:486-495. Available from https://proceedings.mlr.press/v89/adolphs19a.html.

Related Material