A Universal Transfer Theorem for Convex Optimization Algorithms Using Inexact First-order Oracles

Phillip Kerger, Marco Molinaro, Hongyi Jiang, Amitabh Basu
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:23532-23546, 2024.

Abstract

Given any algorithm for convex optimization that uses exact first-order information (i.e., function values and subgradients), we show how to use such an algorithm to solve the problem with access to inexact first-order information. This is done in a “black-box” manner without knowledge of the internal workings of the algorithm. This complements previous work that considers the performance of specific algorithms like (accelerated) gradient descent with inexact information. In particular, our results apply to a wider range of algorithms beyond variants of gradient descent, e.g., projection-free methods, cutting-plane methods, or any other first-order methods formulated in the future. Further, they also apply to algorithms that handle structured nonconvexities like mixed-integer decision variables.
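For intuition, below is a minimal, illustrative sketch (not the paper's construction) of the black-box setting the abstract describes: a first-order method that only interacts with an oracle interface can be handed an inexact oracle in place of the exact one, with no change to the algorithm itself. The function names, the simple subgradient method, and the bounded-noise error model are assumptions made purely for illustration.

import numpy as np

def exact_oracle(f, grad_f, x):
    # Exact first-order oracle: true value and a subgradient at x.
    return f(x), grad_f(x)

def inexact_oracle(f, grad_f, x, eps=1e-3, rng=np.random.default_rng(0)):
    # Inexact oracle: value and subgradient corrupted by bounded noise.
    # (Illustrative error model only; the paper treats general inexact oracles.)
    val = f(x) + eps * rng.uniform(-1, 1)
    g = grad_f(x) + eps * rng.uniform(-1, 1, size=np.shape(x))
    return val, g

def subgradient_method(oracle, x0, steps=200, step=0.1):
    # A generic first-order method that only sees the oracle ("black box").
    x = np.array(x0, dtype=float)
    best_x, best_val = x.copy(), np.inf
    for t in range(steps):
        val, g = oracle(x)
        if val < best_val:
            best_x, best_val = x.copy(), val
        x = x - step / np.sqrt(t + 1) * g
    return best_x, best_val

# Example: f(x) = ||x||_1 on R^2, run with exact vs. inexact information.
f = lambda x: np.sum(np.abs(x))
grad_f = lambda x: np.sign(x)
x_exact, _ = subgradient_method(lambda x: exact_oracle(f, grad_f, x), [1.0, -2.0])
x_inexact, _ = subgradient_method(lambda x: inexact_oracle(f, grad_f, x), [1.0, -2.0])

The only point of the sketch is that subgradient_method never inspects how the oracle is implemented; the paper's transfer theorem quantifies how accurate the inexact oracle must be so that any such black-box algorithm retains its guarantees.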

Cite this Paper

BibTeX
@InProceedings{pmlr-v235-kerger24a,
  title     = {A Universal Transfer Theorem for Convex Optimization Algorithms Using Inexact First-order Oracles},
  author    = {Kerger, Phillip and Molinaro, Marco and Jiang, Hongyi and Basu, Amitabh},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {23532--23546},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/kerger24a/kerger24a.pdf},
  url       = {https://proceedings.mlr.press/v235/kerger24a.html},
  abstract  = {Given any algorithm for convex optimization that uses exact first-order information (i.e., function values and subgradients), we show how to use such an algorithm to solve the problem with access to inexact first-order information. This is done in a “black-box” manner without knowledge of the internal workings of the algorithm. This complements previous work that considers the performance of specific algorithms like (accelerated) gradient descent with inexact information. In particular, our results apply to a wider range of algorithms beyond variants of gradient descent, e.g., projection-free methods, cutting-plane methods, or any other first-order methods formulated in the future. Further, they also apply to algorithms that handle structured nonconvexities like mixed-integer decision variables.}
}
Endnote
%0 Conference Paper
%T A Universal Transfer Theorem for Convex Optimization Algorithms Using Inexact First-order Oracles
%A Phillip Kerger
%A Marco Molinaro
%A Hongyi Jiang
%A Amitabh Basu
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-kerger24a
%I PMLR
%P 23532--23546
%U https://proceedings.mlr.press/v235/kerger24a.html
%V 235
%X Given any algorithm for convex optimization that uses exact first-order information (i.e., function values and subgradients), we show how to use such an algorithm to solve the problem with access to inexact first-order information. This is done in a “black-box” manner without knowledge of the internal workings of the algorithm. This complements previous work that considers the performance of specific algorithms like (accelerated) gradient descent with inexact information. In particular, our results apply to a wider range of algorithms beyond variants of gradient descent, e.g., projection-free methods, cutting-plane methods, or any other first-order methods formulated in the future. Further, they also apply to algorithms that handle structured nonconvexities like mixed-integer decision variables.
APA
Kerger, P., Molinaro, M., Jiang, H. & Basu, A. (2024). A Universal Transfer Theorem for Convex Optimization Algorithms Using Inexact First-order Oracles. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:23532-23546. Available from https://proceedings.mlr.press/v235/kerger24a.html.