Projection Free Online Learning over Smooth Sets

Kfir Levy, Andreas Krause
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1458-1466, 2019.

Abstract

The projection operation is a crucial step in applying Online Gradient Descent (OGD) and its stochastic version, SGD. Unfortunately, in some cases the projection is computationally demanding and prevents us from applying OGD. In this work we focus on the special case where the constraint set is smooth and we have access to gradient and value oracles of the constraint function. Under these assumptions we design a new approximate projection operation that requires only logarithmically many calls to these oracles. We further show that combining OGD with this new approximate projection results in a projection-free variant that recovers the standard rates of the fully projected version. This applies to both convex and strongly convex online settings.
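To illustrate why value-oracle access can make approximate projection cheap, the sketch below bisects along the segment between an assumed interior point `x0` (with `g(x0) < 0`) and the query point, returning a feasible point after O(log(1/tol)) oracle calls. This is only a hedged illustration of the oracle-efficiency idea, not the paper's actual procedure; the set `K = {x : g(x) <= 0}` and the helper names are assumptions for the example.

```python
import numpy as np

def approx_projection(y, x0, g, tol=1e-6, max_iter=100):
    """Approximately project y onto K = {x : g(x) <= 0} by bisecting
    along the segment from an interior point x0 (g(x0) < 0) to y.
    Each iteration costs one call to the value oracle g, so the loop
    runs O(log(1/tol)) times.  Illustrative sketch only, not the
    exact procedure from the paper."""
    if g(y) <= 0:          # y is already feasible: nothing to do
        return y
    lo, hi = 0.0, 1.0      # lo maps to x0 (feasible), hi maps to y (infeasible)
    for _ in range(max_iter):
        if hi - lo < tol:
            break
        mid = 0.5 * (lo + hi)
        x = (1 - mid) * x0 + mid * y
        if g(x) <= 0:
            lo = mid       # midpoint still feasible: move toward y
        else:
            hi = mid       # midpoint infeasible: pull back toward x0
    return (1 - lo) * x0 + lo * y

# Example: project a point onto the unit Euclidean ball, g(x) = ||x||^2 - 1.
g = lambda x: np.dot(x, x) - 1.0
p = approx_projection(np.array([3.0, 0.0]), np.zeros(2), g)
```

Here `p` lands just inside the boundary of the unit ball; tightening `tol` trades extra oracle calls for a more accurate projection.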

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-levy19a,
  title     = {Projection Free Online Learning over Smooth Sets},
  author    = {Levy, Kfir and Krause, Andreas},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {1458--1466},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/levy19a/levy19a.pdf},
  url       = {https://proceedings.mlr.press/v89/levy19a.html},
  abstract  = {The projection operation is a crucial step in applying Online Gradient Descent (OGD) and its stochastic version, SGD. Unfortunately, in some cases the projection is computationally demanding and prevents us from applying OGD. In this work we focus on the special case where the constraint set is smooth and we have access to gradient and value oracles of the constraint function. Under these assumptions we design a new approximate projection operation that requires only logarithmically many calls to these oracles. We further show that combining OGD with this new approximate projection results in a projection-free variant that recovers the standard rates of the fully projected version. This applies to both convex and strongly convex online settings.}
}
Endnote
%0 Conference Paper
%T Projection Free Online Learning over Smooth Sets
%A Kfir Levy
%A Andreas Krause
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-levy19a
%I PMLR
%P 1458--1466
%U https://proceedings.mlr.press/v89/levy19a.html
%V 89
%X The projection operation is a crucial step in applying Online Gradient Descent (OGD) and its stochastic version, SGD. Unfortunately, in some cases the projection is computationally demanding and prevents us from applying OGD. In this work we focus on the special case where the constraint set is smooth and we have access to gradient and value oracles of the constraint function. Under these assumptions we design a new approximate projection operation that requires only logarithmically many calls to these oracles. We further show that combining OGD with this new approximate projection results in a projection-free variant that recovers the standard rates of the fully projected version. This applies to both convex and strongly convex online settings.
APA
Levy, K. & Krause, A. (2019). Projection Free Online Learning over Smooth Sets. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:1458-1466. Available from https://proceedings.mlr.press/v89/levy19a.html.

Related Material