Open Problem: The Oracle Complexity of Smooth Convex Optimization in Nonstandard Settings
Proceedings of The 28th Conference on Learning Theory, PMLR 40:1761-1763, 2015.
Abstract
First-order convex minimization algorithms are currently the methods of choice for large-scale sparse, and more generally parsimonious, regression models. We pose the question of the limits of performance of black-box oriented methods for convex minimization in non-standard settings, where the regularity of the objective is measured in a norm not necessarily induced by the feasible domain. This question is studied for $\ell_p/\ell_q$-settings and their matrix analogues (Schatten norms), where we find surprising gaps between known lower bounds and the guarantees of state-of-the-art methods. We propose a conjecture on the optimal convergence rates for these settings, for which a positive answer would lead to significant improvements in minimization algorithms for parsimonious regression models.
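For concreteness, one standard way to formalize the $\ell_p/\ell_q$-setting alluded to above is the following sketch; the exact normalization and constants here are assumptions for illustration, not taken from the paper:
\[
\min_{x \in B_p^n} f(x), \qquad B_p^n := \{x \in \mathbb{R}^n : \|x\|_p \le 1\},
\]
where $f$ is convex and its gradient is Lipschitz with respect to the $\ell_q$-norm,
\[
\|\nabla f(x) - \nabla f(y)\|_{q^*} \le L\,\|x - y\|_q, \qquad \frac{1}{q} + \frac{1}{q^*} = 1,
\]
with the dual norm $\|\cdot\|_{q^*}$ measuring gradients. The setting is "non-standard" precisely when $q \neq p$, so that the smoothness norm is not the one induced by the feasible domain; the question is the minimal number of first-order oracle queries needed to reach a given accuracy in this regime.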