A Single-Timescale Method for Stochastic Bilevel Optimization

Tianyi Chen, Yuejiao Sun, Quan Xiao, Wotao Yin
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:2466-2488, 2022.

Abstract

Stochastic bilevel optimization generalizes classic stochastic optimization from the minimization of a single objective to the minimization of an objective function that depends on the solution of another optimization problem. Recently, bilevel optimization has been regaining popularity in emerging machine learning applications such as hyper-parameter optimization and model-agnostic meta-learning. To solve this class of problems, existing methods require either double-loop or two-timescale updates, which are sometimes less efficient. This paper develops a new optimization method, termed Single-Timescale stochAstic BiLevEl optimization (STABLE), for a class of stochastic bilevel problems. STABLE runs in a single-loop fashion and uses a single-timescale update with a fixed batch size. To achieve an $\epsilon$-stationary point of the bilevel problem, STABLE requires ${\cal O}(\epsilon^{-2})$ samples in total; to achieve an $\epsilon$-optimal solution in the strongly convex case, it requires ${\cal O}(\epsilon^{-1})$ samples. To the best of our knowledge, when STABLE was proposed, it was the first bilevel optimization algorithm to achieve the same order of sample complexity as SGD for single-level stochastic optimization.
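
For concreteness, the problem class described in the abstract is usually written as the following nested program; this is the standard formulation, with generic symbols f, g, $\xi$, $\phi$ that are not necessarily the paper's own notation:

\begin{align*}
\min_{x \in \mathbb{R}^d} \;\; & F(x) := \mathbb{E}_{\xi}\big[ f(x, y^*(x); \xi) \big] \\
\text{s.t.} \;\; & y^*(x) = \arg\min_{y \in \mathbb{R}^{d'}} \; \mathbb{E}_{\phi}\big[ g(x, y; \phi) \big],
\end{align*}

where f is the upper-level (outer) objective and g is the lower-level (inner) one; hyper-parameter optimization fits this template with x the hyper-parameters and y the trained model weights.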
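
To make the single-loop, single-timescale structure concrete, below is a minimal Python sketch on a toy quadratic instance. It illustrates the update pattern only, not the exact STABLE recursion from the paper; the step sizes, the noise model, and the implicit-gradient correction are assumptions made for this example.

import numpy as np

# Toy instance. Lower level: g(x, y) = 0.5 * ||y - A x||^2, so y*(x) = A x.
# Upper level: f(x, y) = 0.5 * ||y - b||^2, so F(x) = 0.5 * ||A x - b||^2.
# By the implicit function theorem, dy*/dx = -[grad_yy g]^{-1} grad_yx g = A,
# hence the hypergradient is grad F(x) = A^T (y*(x) - b).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))
b = rng.standard_normal(5)

x = np.zeros(3)
y = np.zeros(5)
alpha, beta, sigma = 0.05, 0.5, 0.01  # assumed step sizes and gradient-noise level

for k in range(2000):
    # One stochastic gradient step on the lower-level variable y ...
    grad_y_g = (y - A @ x) + sigma * rng.standard_normal(5)
    y = y - beta * grad_y_g
    # ... and, at the same timescale, one step on x, plugging the current y
    # into the hypergradient formula in place of the unknown y*(x).
    hypergrad = A.T @ (y - b) + sigma * rng.standard_normal(3)
    x = x - alpha * hypergrad

x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # least-squares minimizer of F
print("distance to solution:", np.linalg.norm(x - x_star))

The contrast with prior approaches: a double-loop method would re-solve the lower-level problem (many y steps) before each x update, and a two-timescale method would shrink the y step size at a different rate than the x step size; here both variables advance once per iteration with comparable, fixed step sizes.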

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-chen22e,
  title     = {A Single-Timescale Method for Stochastic Bilevel Optimization},
  author    = {Chen, Tianyi and Sun, Yuejiao and Xiao, Quan and Yin, Wotao},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {2466--2488},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/chen22e/chen22e.pdf},
  url       = {https://proceedings.mlr.press/v151/chen22e.html}
}
Endnote
%0 Conference Paper
%T A Single-Timescale Method for Stochastic Bilevel Optimization
%A Tianyi Chen
%A Yuejiao Sun
%A Quan Xiao
%A Wotao Yin
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-chen22e
%I PMLR
%P 2466--2488
%U https://proceedings.mlr.press/v151/chen22e.html
%V 151
APA
Chen, T., Sun, Y., Xiao, Q. & Yin, W. (2022). A Single-Timescale Method for Stochastic Bilevel Optimization. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:2466-2488. Available from https://proceedings.mlr.press/v151/chen22e.html.
