New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming

Hongcheng Liu, Jindong Tong
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:31933-31948, 2024.

Abstract

This paper studies sample average approximation (SAA) and its simple regularized variation in solving convex or strongly convex stochastic programming problems. Under heavy-tailed assumptions and comparable regularity conditions as in the typical SAA literature, we show — perhaps for the first time — that the sample complexity can be completely free from any complexity measure (e.g., logarithm of the covering number) of the feasible region. As a result, our new bounds can be more advantageous than the state-of-the-art in terms of the dependence on the problem dimensionality.
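To ground the terminology, here is a minimal illustrative sketch of sample average approximation itself, not code from the paper: the expectation in min_x E[F(x, ξ)] is replaced by an empirical mean over i.i.d. scenarios. The newsvendor cost function, the log-normal demand used to mimic heavy tails, and the parameter values are all hypothetical choices for illustration.

```python
import numpy as np

# Illustrative sketch (not from the paper): sample average approximation (SAA)
# replaces the expectation in
#     min_x  E[F(x, xi)]
# with the empirical mean (1/N) sum_i F(x, xi_i) over N i.i.d. scenarios.
# Hypothetical example problem: a newsvendor with unit cost c and price p,
#     F(x, xi) = c*x - p*min(x, xi),
# which is convex in the order quantity x. Heavy-tailed demand is mimicked
# with a log-normal distribution.

rng = np.random.default_rng(0)

def saa_minimize(demand, c=1.0, p=2.0, grid=None):
    """Minimize the SAA objective (1/N) sum_i F(x, xi_i) by grid search."""
    if grid is None:
        grid = np.linspace(0.0, demand.max(), 1001)
    # Empirical objective at each candidate x: c*x - p * mean(min(x, xi_i))
    vals = c * grid - p * np.minimum(grid[:, None], demand[None, :]).mean(axis=1)
    return grid[np.argmin(vals)]

demand = rng.lognormal(mean=1.0, sigma=1.0, size=5000)  # heavy-tailed scenarios
x_saa = saa_minimize(demand)
# For this piecewise-linear F, the SAA minimizer is the (1 - c/p) empirical
# quantile of demand; with c=1, p=2 that is the sample median.
```

The point of the paper's bounds is how large N must be, as a function of accuracy and problem dimension, for such an SAA solution to be near-optimal for the true expectation; the contribution is that the required N need not depend on covering-number-type complexity measures of the feasible region.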

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-liu24bf,
  title     = {New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming},
  author    = {Liu, Hongcheng and Tong, Jindong},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {31933--31948},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/liu24bf/liu24bf.pdf},
  url       = {https://proceedings.mlr.press/v235/liu24bf.html},
  abstract  = {This paper studies sample average approximation (SAA) and its simple regularized variation in solving convex or strongly convex stochastic programming problems. Under heavy-tailed assumptions and comparable regularity conditions as in the typical SAA literature, we show — perhaps for the first time — that the sample complexity can be completely free from any complexity measure (e.g., logarithm of the covering number) of the feasible region. As a result, our new bounds can be more advantageous than the state-of-the-art in terms of the dependence on the problem dimensionality.}
}
Endnote
%0 Conference Paper
%T New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming
%A Hongcheng Liu
%A Jindong Tong
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-liu24bf
%I PMLR
%P 31933--31948
%U https://proceedings.mlr.press/v235/liu24bf.html
%V 235
%X This paper studies sample average approximation (SAA) and its simple regularized variation in solving convex or strongly convex stochastic programming problems. Under heavy-tailed assumptions and comparable regularity conditions as in the typical SAA literature, we show — perhaps for the first time — that the sample complexity can be completely free from any complexity measure (e.g., logarithm of the covering number) of the feasible region. As a result, our new bounds can be more advantageous than the state-of-the-art in terms of the dependence on the problem dimensionality.
APA
Liu, H. & Tong, J. (2024). New Sample Complexity Bounds for Sample Average Approximation in Heavy-Tailed Stochastic Programming. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:31933-31948. Available from https://proceedings.mlr.press/v235/liu24bf.html.