New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:31933-31948, 2024.
Abstract
This paper studies sample average approximation (SAA) and its simple regularized variant for solving convex or strongly convex stochastic programming problems. Under heavy-tailed assumptions and regularity conditions comparable to those in the typical SAA literature, we show, perhaps for the first time, that the sample complexity can be completely free of any complexity measure (e.g., the logarithm of the covering number) of the feasible region. As a result, our new bounds can be more advantageous than the state of the art in terms of their dependence on the problem dimensionality.
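
To make the setting concrete, the following is a minimal sketch of plain SAA and a simple regularized variant, not the paper's algorithm or its bounds. The quadratic loss, the Student-t (heavy-tailed) scenario distribution, the box feasible region, the sample size, and the regularization weight are all illustrative assumptions.

```python
# Illustrative SAA sketch (assumed problem data; not the paper's method or results).
# Approximate min_x E[0.5 * ||x - xi||^2] over a box by minimizing the sample average,
# with xi drawn from a heavy-tailed Student-t distribution.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
d, n = 5, 200                                  # dimension and sample size (assumed)
samples = rng.standard_t(df=3, size=(n, d))    # heavy-tailed scenarios xi_1, ..., xi_n

def saa_objective(x):
    # Sample average of F(x, xi) = 0.5 * ||x - xi||^2 over the drawn scenarios
    return 0.5 * np.mean(np.sum((x - samples) ** 2, axis=1))

def regularized_saa_objective(x, mu=0.1):
    # Simple Tikhonov-style regularization of the SAA objective (assumed form)
    return saa_objective(x) + 0.5 * mu * np.sum(x ** 2)

bounds = [(-1.0, 1.0)] * d                     # compact convex feasible region (a box)
x_init = np.zeros(d)

saa_sol = minimize(saa_objective, x_init, bounds=bounds).x
reg_sol = minimize(regularized_saa_objective, x_init, bounds=bounds).x
print("SAA solution:            ", np.round(saa_sol, 3))
print("Regularized SAA solution:", np.round(reg_sol, 3))
```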