Closing the Computational-Query Depth Gap in Parallel Stochastic Convex Optimization

Arun Jambulapati, Aaron Sidford, Kevin Tian
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:2608-2643, 2024.

Abstract

We develop a new parallel algorithm for minimizing Lipschitz, convex functions with a stochastic subgradient oracle. The total number of queries made and the query depth, i.e., the number of parallel rounds of queries, match the prior state-of-the-art, [CJJLLST23], while improving upon the computational depth by a polynomial factor for sufficiently small accuracy. When combined with previous state-of-the-art methods, our result closes a gap between the best-known query depth and the best-known computational depth of parallel algorithms. Our method starts with a \emph{ball acceleration} framework of previous parallel methods, i.e., [CJJJLST20, ACJJS21], which reduce the problem to minimizing a regularized Gaussian convolution of the function constrained to Euclidean balls. By developing and leveraging new stability properties of the Hessian of this induced function, we depart from prior parallel algorithms and reduce these ball-constrained optimization problems to stochastic unconstrained quadratic minimization problems. Although we are unable to prove concentration of the asymmetric matrices that we use to approximate this Hessian, we nevertheless develop an efficient parallel method for solving these quadratics. Interestingly, our algorithms can be improved using fast matrix multiplication and run in nearly-linear time if the matrix multiplication exponent is 2.
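
As a minimal illustration of the smoothing step described above (notation is ours, not the paper's, and the authors' precise estimator and regularization may differ), the Gaussian convolution of a Lipschitz, convex $f : \mathbb{R}^d \to \mathbb{R}$ with smoothing radius $\rho > 0$ is
\[ f_\rho(x) = \mathbb{E}_{g \sim \mathcal{N}(0, I_d)}\big[ f(x + \rho g) \big], \]
and standard Gaussian-smoothing identities give
\[ \nabla f_\rho(x) = \mathbb{E}\big[ \nabla f(x + \rho g) \big], \qquad \nabla^2 f_\rho(x) = \frac{1}{\rho}\, \mathbb{E}\big[ \nabla f(x + \rho g)\, g^\top \big], \]
where subgradients may replace gradients almost everywhere when $f$ is nondifferentiable. A single stochastic subgradient $\hat{g}$ queried at the perturbed point $x + \rho g$ therefore yields the rank-one sample $\frac{1}{\rho}\, \hat{g}\, g^\top$ as an unbiased but generally asymmetric estimate of $\nabla^2 f_\rho(x)$, which illustrates why Hessian approximations built from subgradient queries, like those alluded to in the abstract, are naturally asymmetric.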

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-jambulapati24b,
  title     = {Closing the Computational-Query Depth Gap in Parallel Stochastic Convex Optimization},
  author    = {Jambulapati, Arun and Sidford, Aaron and Tian, Kevin},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {2608--2643},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/jambulapati24b/jambulapati24b.pdf},
  url       = {https://proceedings.mlr.press/v247/jambulapati24b.html},
  abstract  = {We develop a new parallel algorithm for minimizing Lipschitz, convex functions with a stochastic subgradient oracle. The total number of queries made and the query depth, i.e., the number of parallel rounds of queries, match the prior state-of-the-art, [CJJLLST23], while improving upon the computational depth by a polynomial factor for sufficiently small accuracy. When combined with previous state-of-the-art methods our result closes a gap between the best-known query depth and the best-known computational depth of parallel algorithms. Our method starts with a \emph{ball acceleration} framework of previous parallel methods, i.e., [CJJJLST20, ACJJS21], which reduce the problem to minimizing a regularized Gaussian convolution of the function constrained to Euclidean balls. By developing and leveraging new stability properties of the Hessian of this induced function, we depart from prior parallel algorithms and reduce these ball-constrained optimization problems to stochastic unconstrained quadratic minimization problems. Although we are unable to prove concentration of the asymmetric matrices that we use to approximate this Hessian, we nevertheless develop an efficient parallel method for solving these quadratics. Interestingly, our algorithms can be improved using fast matrix multiplication and run in nearly-linear time if the matrix multiplication exponent is 2.}
}
Endnote
%0 Conference Paper
%T Closing the Computational-Query Depth Gap in Parallel Stochastic Convex Optimization
%A Arun Jambulapati
%A Aaron Sidford
%A Kevin Tian
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-jambulapati24b
%I PMLR
%P 2608--2643
%U https://proceedings.mlr.press/v247/jambulapati24b.html
%V 247
%X We develop a new parallel algorithm for minimizing Lipschitz, convex functions with a stochastic subgradient oracle. The total number of queries made and the query depth, i.e., the number of parallel rounds of queries, match the prior state-of-the-art, [CJJLLST23], while improving upon the computational depth by a polynomial factor for sufficiently small accuracy. When combined with previous state-of-the-art methods our result closes a gap between the best-known query depth and the best-known computational depth of parallel algorithms. Our method starts with a \emph{ball acceleration} framework of previous parallel methods, i.e., [CJJJLST20, ACJJS21], which reduce the problem to minimizing a regularized Gaussian convolution of the function constrained to Euclidean balls. By developing and leveraging new stability properties of the Hessian of this induced function, we depart from prior parallel algorithms and reduce these ball-constrained optimization problems to stochastic unconstrained quadratic minimization problems. Although we are unable to prove concentration of the asymmetric matrices that we use to approximate this Hessian, we nevertheless develop an efficient parallel method for solving these quadratics. Interestingly, our algorithms can be improved using fast matrix multiplication and run in nearly-linear time if the matrix multiplication exponent is 2.
APA
Jambulapati, A., Sidford, A., &amp; Tian, K. (2024). Closing the Computational-Query Depth Gap in Parallel Stochastic Convex Optimization. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:2608-2643. Available from https://proceedings.mlr.press/v247/jambulapati24b.html.