The Sample Complexity of Optimizing a Convex Function
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:275-301, 2017.
Abstract
In this paper, we study optimization from samples of convex functions. In many scenarios we do not know the function we wish to optimize but can learn it from data, and in such cases we are interested in bounding the number of samples required to optimize the function. Our main result shows that, in general, the number of samples required to obtain a nontrivial approximation to the optimum of a convex function is exponential in its dimension, even when the function is PAC-learnable. We also obtain strong lower bounds for strongly convex and Lipschitz-continuous functions. On the positive side, we show that there are interesting classes of functions and distributions for which the sample complexity is polynomial in the dimension of the function.
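As a toy illustration of the optimization-from-samples setting (this is a hypothetical example, not a construction from the paper): for a simple quadratic family such as f(x) = ||x - x*||^2, the unknown function can be recovered from polynomially many samples by linear least squares, after which its optimum is known exactly. The exponential lower bound of the paper concerns far richer classes of convex functions, where no such recovery is possible.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 200
x_star = rng.normal(size=d)              # unknown optimum of f(x) = ||x - x*||^2
X = rng.normal(size=(n, d))              # sample points drawn from a Gaussian
y = ((X - x_star) ** 2).sum(axis=1)      # observed function values at the samples

# Expanding f:  y - ||x||^2 = -2 x . x* + ||x*||^2, which is linear in (x*, ||x*||^2),
# so ordinary least squares recovers the minimizer x*.
A = np.hstack([-2 * X, np.ones((n, 1))])
coef, *_ = np.linalg.lstsq(A, y - (X ** 2).sum(axis=1), rcond=None)
x_hat = coef[:d]                         # estimated minimizer of the learned function

print(np.linalg.norm(x_hat - x_star))    # essentially zero: n >> d samples suffice here
```

This sketch only shows why quadratic-like families fall on the "polynomial sample complexity" side; the paper's hardness result is precisely that no analogous procedure works for general PAC-learnable convex functions.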