A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton

Risheng Liu, Pan Mu, Xiaoming Yuan, Shangzhi Zeng, Jin Zhang
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:6305-6315, 2020.

Abstract

In recent years, a variety of gradient-based bi-level optimization methods have been developed for learning tasks. However, theoretical guarantees of these existing approaches often heavily rely on the simplification that for each fixed upper-level variable, the lower-level solution must be a singleton (a.k.a., Lower-Level Singleton, LLS). In this work, by formulating bi-level models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized algorithmic framework for bi-level programming. Theoretically, we derive a new methodology to prove the convergence of BDA without the LLS condition. Furthermore, we improve the convergence properties of conventional first-order bi-level schemes (under the LLS simplification) based on our proof recipe. Extensive experiments justify our theoretical results and demonstrate the superiority of the proposed BDA for different tasks, including hyper-parameter optimization and meta learning.
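The aggregation idea at the heart of BDA can be illustrated with a small sketch. Below is a minimal Python example of an aggregated inner loop that descends a convex combination of the upper- and lower-level gradients; the toy quadratic objectives, step size, and decaying aggregation weight alpha_k are illustrative assumptions on our part, not the paper's exact setup (the precise schedules and convergence conditions are in the full text):

import numpy as np

# Toy problem where the lower level has infinitely many minimizers,
# so the Lower-Level Singleton (LLS) condition fails:
#   upper level:  F(x, y) = 0.5 * ||y - x||^2
#   lower level:  f(x, y) = 0.5 * ||A @ y - x||^2 with rank-deficient A,
# hence every y with y[0] = x[0] minimizes f, and y[1] is free.
A = np.diag([1.0, 0.0])

def grad_F_y(x, y):
    # Gradient of the upper-level objective with respect to y.
    return y - x

def grad_f_y(x, y):
    # Gradient of the lower-level objective with respect to y.
    return A.T @ (A @ y - x)

def bda_inner(x, y0, K=1000, s=0.1):
    """Aggregated inner loop: step along a convex combination of the
    upper- and lower-level descent directions in y. The weight schedule
    alpha_k below is an illustrative assumption."""
    y = y0.copy()
    for k in range(K):
        alpha = (k + 2.0) ** -0.5  # decaying aggregation weight (assumed)
        d = alpha * grad_F_y(x, y) + (1.0 - alpha) * grad_f_y(x, y)
        y -= s * d
    return y

x = np.array([1.0, 2.0])
print(bda_inner(x, np.zeros(2)))
# -> approximately [1., 2.]: among the many lower-level minimizers
#    (y[0] = 1, y[1] arbitrary), the aggregated updates select the one
#    that also minimizes the upper-level objective. Plain gradient descent
#    on f alone would return [1., 0.] from this initialization.

In the full framework, the upper-level variable x would then be updated using first-order information from this inner trajectory; that outer loop is omitted here for brevity.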

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-liu20l,
  title     = {A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton},
  author    = {Liu, Risheng and Mu, Pan and Yuan, Xiaoming and Zeng, Shangzhi and Zhang, Jin},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {6305--6315},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/liu20l/liu20l.pdf},
  url       = {https://proceedings.mlr.press/v119/liu20l.html},
  abstract  = {In recent years, a variety of gradient-based bi-level optimization methods have been developed for learning tasks. However, theoretical guarantees of these existing approaches often heavily rely on the simplification that for each fixed upper-level variable, the lower-level solution must be a singleton (a.k.a., Lower-Level Singleton, LLS). In this work, by formulating bi-level models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized algorithmic framework for bi-level programming. Theoretically, we derive a new methodology to prove the convergence of BDA without the LLS condition. Furthermore, we improve the convergence properties of conventional first-order bi-level schemes (under the LLS simplification) based on our proof recipe. Extensive experiments justify our theoretical results and demonstrate the superiority of the proposed BDA for different tasks, including hyper-parameter optimization and meta learning.}
}
Endnote
%0 Conference Paper
%T A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton
%A Risheng Liu
%A Pan Mu
%A Xiaoming Yuan
%A Shangzhi Zeng
%A Jin Zhang
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-liu20l
%I PMLR
%P 6305--6315
%U https://proceedings.mlr.press/v119/liu20l.html
%V 119
%X In recent years, a variety of gradient-based bi-level optimization methods have been developed for learning tasks. However, theoretical guarantees of these existing approaches often heavily rely on the simplification that for each fixed upper-level variable, the lower-level solution must be a singleton (a.k.a., Lower-Level Singleton, LLS). In this work, by formulating bi-level models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized algorithmic framework for bi-level programming. Theoretically, we derive a new methodology to prove the convergence of BDA without the LLS condition. Furthermore, we improve the convergence properties of conventional first-order bi-level schemes (under the LLS simplification) based on our proof recipe. Extensive experiments justify our theoretical results and demonstrate the superiority of the proposed BDA for different tasks, including hyper-parameter optimization and meta learning.
APA
Liu, R., Mu, P., Yuan, X., Zeng, S. & Zhang, J. (2020). A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:6305-6315. Available from https://proceedings.mlr.press/v119/liu20l.html.