Joker: Joint Optimization Framework for Lightweight Kernel Machines

Junhong Zhang, Zhihui Lai
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:75297-75313, 2025.

Abstract

Kernel methods are powerful tools for nonlinear learning with well-established theory, but scalability has been their long-standing challenge. Despite existing successes, large-scale kernel methods suffer from two limitations: (i) their memory overhead is too high for many users to afford; (ii) existing efforts focus mainly on kernel ridge regression (KRR), while other models remain understudied. In this paper, we propose Joker, a joint optimization framework for diverse kernel models, including KRR, logistic regression, and support vector machines. We design a dual block coordinate descent method with trust region (DBCD-TR) and adopt kernel approximation with randomized features, leading to low memory costs and high efficiency in large-scale learning. Experiments show that Joker saves up to 90% memory while achieving training time and performance comparable to, or even better than, state-of-the-art methods.
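
For context, the sketch below illustrates the standard random Fourier features construction (Rahimi & Recht, 2007), the kind of randomized-feature kernel approximation the abstract refers to. It is a minimal, generic illustration, not the authors' implementation: the function name random_fourier_features, the bandwidth parameter sigma, and the choice of the Gaussian kernel are our assumptions.

import numpy as np

def random_fourier_features(X, num_features, sigma, rng):
    # Map X (n, d) to Z (n, num_features) so that Z @ Z.T approximates
    # the Gaussian kernel K(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    d = X.shape[1]
    # Frequencies drawn from the Gaussian kernel's spectral density.
    W = rng.normal(scale=1.0 / sigma, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Z = random_fourier_features(X, num_features=4096, sigma=1.0, rng=rng)

# Compare the approximate kernel against the exact Gaussian kernel.
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K_exact = np.exp(-sq_dists / 2.0)
print(np.abs(K_approx - K_exact).max())  # shrinks as num_features grows

A model trained on Z needs only O(n x num_features) memory instead of the O(n^2) of a full kernel matrix, which is the kind of memory saving the abstract alludes to.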

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-zhang25an,
  title = {Joker: Joint Optimization Framework for Lightweight Kernel Machines},
  author = {Zhang, Junhong and Lai, Zhihui},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages = {75297--75313},
  year = {2025},
  editor = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume = {267},
  series = {Proceedings of Machine Learning Research},
  month = {13--19 Jul},
  publisher = {PMLR},
  pdf = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/zhang25an/zhang25an.pdf},
  url = {https://proceedings.mlr.press/v267/zhang25an.html},
  abstract = {Kernel methods are powerful tools for nonlinear learning with well-established theory, but scalability has been their long-standing challenge. Despite existing successes, large-scale kernel methods suffer from two limitations: (i) their memory overhead is too high for many users to afford; (ii) existing efforts focus mainly on kernel ridge regression (KRR), while other models remain understudied. In this paper, we propose Joker, a joint optimization framework for diverse kernel models, including KRR, logistic regression, and support vector machines. We design a dual block coordinate descent method with trust region (DBCD-TR) and adopt kernel approximation with randomized features, leading to low memory costs and high efficiency in large-scale learning. Experiments show that Joker saves up to 90% memory while achieving training time and performance comparable to, or even better than, state-of-the-art methods.}
}
Endnote
%0 Conference Paper
%T Joker: Joint Optimization Framework for Lightweight Kernel Machines
%A Junhong Zhang
%A Zhihui Lai
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-zhang25an
%I PMLR
%P 75297--75313
%U https://proceedings.mlr.press/v267/zhang25an.html
%V 267
%X Kernel methods are powerful tools for nonlinear learning with well-established theory, but scalability has been their long-standing challenge. Despite existing successes, large-scale kernel methods suffer from two limitations: (i) their memory overhead is too high for many users to afford; (ii) existing efforts focus mainly on kernel ridge regression (KRR), while other models remain understudied. In this paper, we propose Joker, a joint optimization framework for diverse kernel models, including KRR, logistic regression, and support vector machines. We design a dual block coordinate descent method with trust region (DBCD-TR) and adopt kernel approximation with randomized features, leading to low memory costs and high efficiency in large-scale learning. Experiments show that Joker saves up to 90% memory while achieving training time and performance comparable to, or even better than, state-of-the-art methods.
APA
Zhang, J. & Lai, Z. (2025). Joker: Joint Optimization Framework for Lightweight Kernel Machines. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:75297-75313. Available from https://proceedings.mlr.press/v267/zhang25an.html.