Efficient Projection-Free Online Convex Optimization with Membership Oracle
Proceedings of Thirty Fifth Conference on Learning Theory, PMLR 178:5314-5390, 2022.
Abstract
In constrained convex optimization, existing interior point methods do not scale well with the dimension of the ambient space. Alternative approaches such as Projected Gradient Descent provide a computational benefit only for simple convex sets where Euclidean projections can be performed efficiently, such as Euclidean balls. For more complex sets, the cost of the projections can be too high. To circumvent these issues, alternative methods based on the famous Frank-Wolfe algorithm have been studied and widely used. Such methods use a Linear Optimization Oracle at each iteration instead of Euclidean projections; this oracle can often be implemented efficiently. Such methods have also been extended to the online and stochastic optimization settings. However, the Frank-Wolfe algorithm and its variants do not achieve the optimal performance, in terms of regret or rate, for general convex sets. What is more, the Linear Optimization Oracle they use can still be computationally expensive in some cases. In this paper, we move away from Frank-Wolfe-style algorithms and present a new reduction that turns any algorithm $\mathsf{A}$ over a Euclidean ball (where projections are cheap) into an algorithm over a general convex constraint set $\mathcal{C}$ contained within the ball, without sacrificing much of the performance of the original algorithm $\mathsf{A}$. Our reduction requires $O(T \ln T)$ calls to a Membership Oracle on $\mathcal{C}$ after $T$ rounds, and no linear optimization on $\mathcal{C}$ is needed. Using this reduction, we recover optimal regret bounds [resp. rates], in terms of the number of iterations, in online [resp. stochastic] convex optimization. Our guarantees are also useful in the offline convex optimization setting when the dimension of the ambient space is large.
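To make the Membership-Oracle idea concrete, the following is a minimal illustrative Python sketch, not the paper's actual reduction: it runs a standard online gradient descent update over an enclosing Euclidean ball and pulls each iterate back into a hypothetical constraint set C (here an example polytope {x : Ax <= b}) by bisecting the gauge along the ray from an interior point, using only membership queries. The set, step size, and gradient stream are assumptions chosen for illustration.

    import numpy as np

    def make_membership_oracle(A, b):
        # Membership oracle for the example polytope {x : A x <= b} (illustrative set).
        return lambda x: bool(np.all(A @ x <= b + 1e-12))

    def gauge_pull_back(x, member, center, tol=1e-6):
        # Map a point of the enclosing ball back into the set using membership queries only.
        # Bisection approximates the gauge (Minkowski functional) of the set w.r.t. the
        # interior point `center`: O(log(1/tol)) membership calls, no Euclidean projection
        # and no linear optimization needed.
        if member(x):
            return x
        lo, hi = 0.0, 1.0  # center + t*(x - center): feasible at t = 0, infeasible at t = 1
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if member(center + mid * (x - center)):
                lo = mid
            else:
                hi = mid
        return center + lo * (x - center)

    # Toy usage: online gradient descent over the enclosing ball, with each iterate
    # mapped into C = {x : A x <= b} via membership queries.
    rng = np.random.default_rng(0)
    d = 5
    A = rng.standard_normal((20, d)); b = np.ones(20)
    member = make_membership_oracle(A, b)
    center, radius, eta = np.zeros(d), 10.0, 0.1
    x = center.copy()
    for t in range(1, 101):
        g = rng.standard_normal(d)               # placeholder (sub)gradient from the environment
        y = x - eta / np.sqrt(t) * g             # gradient step of the ball algorithm A
        if np.linalg.norm(y - center) > radius:  # cheap projection onto the Euclidean ball
            y = center + radius * (y - center) / np.linalg.norm(y - center)
        x = gauge_pull_back(y, member, center)

This pull-back step is only meant to show how membership queries can replace projections and linear optimization in the inner loop; the paper's reduction and its regret analysis are more involved.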