Bilevel Optimization for Hyperparameter Learning in Supporting Vector Machines

Lei Huang, Jiawang Nie, Jiajia Wang, Suhan Zhong
Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025), PMLR 321:45-55, 2026.

Abstract

Bilevel optimization is central to many machine learning tasks, including hyperparameter learning and adversarial training. We present a novel single-level reformulation for bilevel problems with convex lower-level objective functions and linear constraints. Our method eliminates auxiliary Lagrange multiplier variables by expressing them in terms of the original decision variables, which allows the reformulated problem to preserve the same dimension as the original problem. We apply our method to support vector machines (SVMs) and evaluate it on several benchmark tasks, demonstrating its efficiency and scalability.
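To make the problem setting concrete, here is a minimal sketch of bilevel hyperparameter learning for a linear SVM. It is not the paper's single-level reformulation: the lower level (SVM training for a fixed penalty C) is solved by subgradient descent, and the upper level (choosing C to minimize validation hinge loss) is handled by a simple grid search. All data, the C grid, and the solver parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data with labels in {-1, +1}.
X = rng.normal(size=(80, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
X_tr, y_tr = X[:60], y[:60]   # lower-level (training) data
X_va, y_va = X[60:], y[60:]   # upper-level (validation) data

def train_svm(X, y, C, steps=2000, lr=0.01):
    """Lower-level problem: minimize 0.5*||w||^2 + C * mean(hinge)
    by subgradient descent (no bias term, for brevity)."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        margins = y * (X @ w)
        viol = margins < 1  # points with nonzero hinge loss
        grad = w - (C / n) * (y[viol, None] * X[viol]).sum(axis=0)
        w -= lr * grad
    return w

def val_loss(w):
    """Upper-level objective: average hinge loss on validation data."""
    return np.maximum(0.0, 1 - y_va * (X_va @ w)).mean()

# Upper level: pick the hyperparameter C minimizing validation loss.
grid = [0.01, 0.1, 1.0, 10.0]
best_C = min(grid, key=lambda C: val_loss(train_svm(X_tr, y_tr, C)))
```

The paper's contribution is to replace this nested solve with a single-level problem of the same dimension, using the lower level's linear constraints and convexity to express the Lagrange multipliers in terms of the original variables.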

Cite this Paper


BibTeX
@InProceedings{pmlr-v321-huang26a, title = {Bilevel Optimization for Hyperparameter Learning in Supporting Vector Machines}, author = {Huang, Lei and Nie, Jiawang and Wang, Jiajia and Zhong, Suhan}, booktitle = {Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025)}, pages = {45--55}, year = {2026}, editor = {Bernardez Gil, Guillermo and Black, Mitchell and Cloninger, Alexander and Doster, Timothy and Emerson, Tegan and García-Redondo, Inés and Holtz, Chester and Kotak, Mit and Kvinge, Henry and Mishne, Gal and Papillon, Mathilde and Pouplin, Alison and Rainey, Katie and Rieck, Bastian and Telyatnikov, Lev and Yeats, Eric and Wang, Qingsong and Wang, Yusu and Wayland, Jeremy}, volume = {321}, series = {Proceedings of Machine Learning Research}, month = {01--02 Dec}, publisher = {PMLR}, pdf = {https://raw.githubusercontent.com/mlresearch/v321/main/assets/huang26a/huang26a.pdf}, url = {https://proceedings.mlr.press/v321/huang26a.html}, abstract = {Bilevel optimization is central to many machine learning tasks, including hyperparameter learning and adversarial training. We present a novel single-level reformulation for bilevel problems with convex lower-level objective functions and linear constraints. Our method eliminates auxiliary Lagrange multiplier variables by expressing them in terms of the original decision variables, which allows the reformulated problem to preserve the same dimension as the original problem. We applied our method to support vector machines (SVMs) and evaluated it on several benchmark tasks, demonstrating efficiency and scalability.} }
Endnote
%0 Conference Paper %T Bilevel Optimization for Hyperparameter Learning in Supporting Vector Machines %A Lei Huang %A Jiawang Nie %A Jiajia Wang %A Suhan Zhong %B Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025) %C Proceedings of Machine Learning Research %D 2026 %E Guillermo Bernardez Gil %E Mitchell Black %E Alexander Cloninger %E Timothy Doster %E Tegan Emerson %E Inés García-Redondo %E Chester Holtz %E Mit Kotak %E Henry Kvinge %E Gal Mishne %E Mathilde Papillon %E Alison Pouplin %E Katie Rainey %E Bastian Rieck %E Lev Telyatnikov %E Eric Yeats %E Qingsong Wang %E Yusu Wang %E Jeremy Wayland %F pmlr-v321-huang26a %I PMLR %P 45--55 %U https://proceedings.mlr.press/v321/huang26a.html %V 321 %X Bilevel optimization is central to many machine learning tasks, including hyperparameter learning and adversarial training. We present a novel single-level reformulation for bilevel problems with convex lower-level objective functions and linear constraints. Our method eliminates auxiliary Lagrange multiplier variables by expressing them in terms of the original decision variables, which allows the reformulated problem to preserve the same dimension as the original problem. We applied our method to support vector machines (SVMs) and evaluated it on several benchmark tasks, demonstrating efficiency and scalability.
APA
Huang, L., Nie, J., Wang, J. & Zhong, S. (2026). Bilevel Optimization for Hyperparameter Learning in Supporting Vector Machines. Proceedings of the 1st Conference on Topology, Algebra, and Geometry in Data Science (TAG-DS 2025), in Proceedings of Machine Learning Research 321:45-55. Available from https://proceedings.mlr.press/v321/huang26a.html.

Related Material