Online Bilevel Optimization: Regret Analysis of Online Alternating Gradient Methods

Davoud Ataee Tarzanagh, Parvin Nazari, Bojian Hou, Li Shen, Laura Balzano
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:2854-2862, 2024.

Abstract

This paper introduces online bilevel optimization in which a sequence of time-varying bilevel problems is revealed one after the other. We extend the known regret bounds for single-level online algorithms to the bilevel setting. Specifically, we provide new notions of bilevel regret, develop an online alternating time-averaged gradient method that is capable of leveraging smoothness, and give regret bounds in terms of the path-length of the inner and outer minimizer sequences.
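
The abstract describes the method only at a high level. The following is a minimal, hypothetical Python sketch of an online alternating gradient loop on synthetic time-varying bilevel rounds; the quadratic losses f_t and g_t, the step sizes, and the replacement of the true hypergradient by the partial outer gradient are all assumptions for illustration, not the paper's actual algorithm or its time-averaging scheme.

import numpy as np

# A minimal sketch of an online alternating gradient loop over a sequence of
# synthetic bilevel rounds. The quadratic losses, step sizes, and the use of
# the partial outer gradient in place of a true hypergradient are assumptions
# for illustration only; they are not the paper's exact method.

rng = np.random.default_rng(0)
x = np.zeros(2)          # outer (upper-level) variable
y = np.zeros(2)          # inner (lower-level) variable
alpha, beta = 0.1, 0.5   # outer / inner step sizes (assumed values)
T = 100
cum_loss = 0.0

for t in range(T):
    # Round t reveals an outer loss f_t and an inner loss g_t with drifting targets.
    a = rng.standard_normal(2)
    b = rng.standard_normal(2)
    f = lambda x_, y_: 0.5 * np.sum((x_ - a) ** 2) + 0.5 * np.sum((y_ - x_) ** 2)
    grad_f_x = lambda x_, y_: (x_ - a) - (y_ - x_)   # partial gradient of f_t in x
    grad_g_y = lambda x_, y_: y_ - x_ - b            # gradient of g_t in y

    cum_loss += f(x, y)   # outer loss suffered before updating

    # Alternating updates: one inner gradient step on g_t, then one outer step on f_t.
    y = y - beta * grad_g_y(x, y)
    x = x - alpha * grad_f_x(x, y)

print(f"average outer loss over T={T} rounds: {cum_loss / T:.4f}")

In the paper's analysis, the regret of such alternating schemes is bounded in terms of the path-length of the inner and outer minimizer sequences across rounds.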

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-ataee-tarzanagh24a,
  title = {Online Bilevel Optimization: Regret Analysis of Online Alternating Gradient Methods},
  author = {Ataee Tarzanagh, Davoud and Nazari, Parvin and Hou, Bojian and Shen, Li and Balzano, Laura},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages = {2854--2862},
  year = {2024},
  editor = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume = {238},
  series = {Proceedings of Machine Learning Research},
  month = {02--04 May},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v238/ataee-tarzanagh24a/ataee-tarzanagh24a.pdf},
  url = {https://proceedings.mlr.press/v238/ataee-tarzanagh24a.html},
  abstract = {This paper introduces \textit{online bilevel optimization} in which a sequence of time-varying bilevel problems is revealed one after the other. We extend the known regret bounds for single-level online algorithms to the bilevel setting. Specifically, we provide new notions of \textit{bilevel regret}, develop an online alternating time-averaged gradient method that is capable of leveraging smoothness, and give regret bounds in terms of the path-length of the inner and outer minimizer sequences.}
}
Endnote
%0 Conference Paper
%T Online Bilevel Optimization: Regret Analysis of Online Alternating Gradient Methods
%A Davoud Ataee Tarzanagh
%A Parvin Nazari
%A Bojian Hou
%A Li Shen
%A Laura Balzano
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-ataee-tarzanagh24a
%I PMLR
%P 2854--2862
%U https://proceedings.mlr.press/v238/ataee-tarzanagh24a.html
%V 238
%X This paper introduces \textit{online bilevel optimization} in which a sequence of time-varying bilevel problems is revealed one after the other. We extend the known regret bounds for single-level online algorithms to the bilevel setting. Specifically, we provide new notions of \textit{bilevel regret}, develop an online alternating time-averaged gradient method that is capable of leveraging smoothness, and give regret bounds in terms of the path-length of the inner and outer minimizer sequences.
APA
Ataee Tarzanagh, D., Nazari, P., Hou, B., Shen, L. & Balzano, L. (2024). Online Bilevel Optimization: Regret Analysis of Online Alternating Gradient Methods. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:2854-2862. Available from https://proceedings.mlr.press/v238/ataee-tarzanagh24a.html.