Projection Free Online Learning over Smooth Sets
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:1458-1466, 2019.
Abstract
The projection operation is a crucial step in applying Online Gradient Descent (OGD) and its stochastic version SGD. Unfortunately, in some cases projection is computationally demanding and inhibits us from applying OGD. In this work we focus on the special case where the constraint set is smooth and we have access to gradient and value oracles of the constraint function. Under these assumptions we design a new approximate projection operation that requires only logarithmically many calls to these oracles. We further show that combining OGD with this new approximate projection results in a projection-free variant that recovers the standard rates of the fully projected version. This applies to both convex and strongly convex online settings.
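To make the logarithmic oracle complexity concrete, here is a minimal sketch, not the paper's exact procedure: it represents the constraint set as K = {x : f(x) <= 0}, assumes a known interior point, and bisects along the segment from that point to the query, so each approximate projection costs O(log(1/eps)) value-oracle calls (the paper's operation also exploits the gradient oracle). All names here (f, x_interior, eps) are illustrative assumptions.

```python
import numpy as np

def approx_project(f, x_interior, y, eps=1e-6):
    """Approximate projection onto K = {x : f(x) <= 0} via bisection on
    the segment [x_interior, y]; returns a feasible point near the
    boundary (not the exact Euclidean projection in general)."""
    if f(y) <= 0:                      # one value-oracle call
        return y                       # y is already feasible
    lo, hi = 0.0, 1.0                  # lo stays feasible, hi infeasible
    while hi - lo > eps:               # ~log2(1/eps) oracle calls
        mid = 0.5 * (lo + hi)
        z = (1.0 - mid) * x_interior + mid * y
        if f(z) <= 0:                  # value-oracle call
            lo = mid
        else:
            hi = mid
    return (1.0 - lo) * x_interior + lo * y  # last certified-feasible point

def ogd_step(x, grad, eta, f, x_interior):
    """One OGD step with the approximate projection in place of the
    exact one."""
    return approx_project(f, x_interior, x - eta * grad)

if __name__ == "__main__":
    ball = lambda x: np.dot(x, x) - 1.0     # unit Euclidean ball
    center = np.zeros(2)                    # known interior point
    y = np.array([3.0, 4.0])
    print(approx_project(ball, center, y))  # ~[0.6, 0.8]
```

For a ball centered at the interior point this bisection happens to recover the exact projection; for a general smooth set it only guarantees feasibility, and controlling the resulting error in the regret bound is exactly the kind of gap the paper's analysis addresses to retain the standard OGD rates.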