A Geometric Algorithm for Scalable Multiple Kernel Learning

John Moeller, Parasaran Raman, Suresh Venkatasubramanian, Avishek Saha
Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, PMLR 33:633-642, 2014.

Abstract

We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that we are significantly faster and even compare favorably with a uniform unweighted combination of kernels.

Cite this Paper


BibTeX
@InProceedings{pmlr-v33-moeller14,
  title     = {{A Geometric Algorithm for Scalable Multiple Kernel Learning}},
  author    = {John Moeller and Parasaran Raman and Suresh Venkatasubramanian and Avishek Saha},
  booktitle = {Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics},
  pages     = {633--642},
  year      = {2014},
  editor    = {Samuel Kaski and Jukka Corander},
  volume    = {33},
  series    = {Proceedings of Machine Learning Research},
  address   = {Reykjavik, Iceland},
  month     = {22--25 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v33/moeller14.pdf},
  url       = {http://proceedings.mlr.press/v33/moeller14.html},
  abstract  = {We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that we are significantly faster and even compare favorably with a uniform unweighted combination of kernels.}
}
Endnote
%0 Conference Paper
%T A Geometric Algorithm for Scalable Multiple Kernel Learning
%A John Moeller
%A Parasaran Raman
%A Suresh Venkatasubramanian
%A Avishek Saha
%B Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2014
%E Samuel Kaski
%E Jukka Corander
%F pmlr-v33-moeller14
%I PMLR
%J Proceedings of Machine Learning Research
%P 633--642
%U http://proceedings.mlr.press
%V 33
%W PMLR
%X We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that we are significantly faster and even compare favorably with a uniform unweighted combination of kernels.
RIS
TY - CPAPER
TI - A Geometric Algorithm for Scalable Multiple Kernel Learning
AU - John Moeller
AU - Parasaran Raman
AU - Suresh Venkatasubramanian
AU - Avishek Saha
BT - Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics
PY - 2014/04/02
DA - 2014/04/02
ED - Samuel Kaski
ED - Jukka Corander
ID - pmlr-v33-moeller14
PB - PMLR
SP - 633
DP - PMLR
EP - 642
L1 - http://proceedings.mlr.press/v33/moeller14.pdf
UR - http://proceedings.mlr.press/v33/moeller14.html
AB - We present a geometric formulation of the Multiple Kernel Learning (MKL) problem. To do so, we reinterpret the problem of learning kernel weights as searching for a kernel that maximizes the minimum (kernel) distance between two convex polytopes. This interpretation, combined with additional structural insights from our geometric formulation, allows us to reduce the MKL problem to a simple optimization routine that yields provable convergence as well as quality guarantees. As a result, our method scales efficiently to much larger data sets than most prior methods can handle. Empirical evaluation on eleven datasets shows that we are significantly faster and even compare favorably with a uniform unweighted combination of kernels.
ER -
APA
Moeller, J., Raman, P., Venkatasubramanian, S. & Saha, A. (2014). A Geometric Algorithm for Scalable Multiple Kernel Learning. Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, in PMLR 33:633-642