Theory and Algorithms for the Localized Setting of Learning Kernels

Yunwen Lei, Alexander Binder, Ürün Dogan, Marius Kloft
Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015, PMLR 44:173-195, 2015.

Abstract

We analyze the localized setting of learning kernels, also known as localized multiple kernel learning. This problem has been addressed in the past through rather heuristic approaches that approximately optimize non-convex problem formulations, for which no theoretical learning bounds are known to date. In this paper, we show generalization error bounds for learning localized kernel classes where the localities are coupled using graph-based regularization. We propose a novel algorithm for learning localized kernels based on this hypothesis class, formulated as a convex optimization problem that uses a pre-obtained cluster structure of the data. We derive dual representations using Fenchel conjugation theory and, based on these, give a simple yet efficient wrapper-based optimization algorithm. We apply the method to problems involving multiple heterogeneous data sources, taken from the domains of computational biology and computer vision. The results show that the proposed convex approach to learning localized kernels can achieve higher prediction accuracies than its global and non-convex local counterparts.
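To illustrate the general idea of localized multiple kernel learning (not the paper's specific algorithm), the following toy sketch combines base kernels with cluster-dependent weights. The symmetrized weighting via square roots and the kernel ridge learner are illustrative assumptions, chosen for simplicity; the paper's actual formulation, regularizer, and learner differ.

```python
import numpy as np

def localized_kernel(kernels, clusters, beta):
    """Combine base kernels with cluster-dependent weights.

    kernels  : list of (n, n) base kernel matrices
    clusters : (n,) integer cluster index of each point
    beta     : (n_clusters, n_kernels) nonnegative weights

    Entry (i, j) of kernel m is weighted by sqrt(beta[c_i, m] * beta[c_j, m]),
    a common symmetrization in localized MKL sketches (an assumption here,
    not necessarily the paper's parametrization).
    """
    n = len(clusters)
    K = np.zeros((n, n))
    for m, Km in enumerate(kernels):
        w = beta[clusters, m]                 # per-sample weight for kernel m
        K += np.sqrt(np.outer(w, w)) * Km
    return K

# Toy data: two pre-obtained clusters, two base kernels (linear and RBF).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (10, 2)), rng.normal(3, 1, (10, 2))])
y = np.r_[np.ones(10), -np.ones(10)]
clusters = np.r_[np.zeros(10, int), np.ones(10, int)]

K_lin = X @ X.T
sq = ((X[:, None] - X[None]) ** 2).sum(-1)
K_rbf = np.exp(-sq / 2.0)

beta = np.array([[0.8, 0.2],   # cluster 0 leans on the linear kernel
                 [0.2, 0.8]])  # cluster 1 leans on the RBF kernel
K = localized_kernel([K_lin, K_rbf], clusters, beta)

# Kernel ridge regression as a simple stand-in for an SVM-type learner.
alpha = np.linalg.solve(K + 1e-2 * np.eye(len(y)), y)
preds = np.sign(K @ alpha)
print("training accuracy:", (preds == y).mean())
```

In the paper, the per-cluster weights are themselves learned jointly with the predictor under a graph-based coupling of the localities, and the whole problem is kept convex; here the weights are fixed by hand purely to show how a cluster-dependent kernel combination enters the prediction.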

Cite this Paper


BibTeX
@InProceedings{pmlr-v44-LeiBinDogKlo15,
  title     = {Theory and Algorithms for the Localized Setting of Learning Kernels},
  author    = {Lei, Yunwen and Binder, Alexander and Dogan, Ürün and Kloft, Marius},
  booktitle = {Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015},
  pages     = {173--195},
  year      = {2015},
  editor    = {Storcheus, Dmitry and Rostamizadeh, Afshin and Kumar, Sanjiv},
  volume    = {44},
  series    = {Proceedings of Machine Learning Research},
  address   = {Montreal, Canada},
  month     = {11 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v44/LeiBinDogKlo15.pdf},
  url       = {https://proceedings.mlr.press/v44/LeiBinDogKlo15.html},
  abstract  = {We analyze the localized setting of learning kernels also known as localized multiple kernel learning. This problem has been addressed in the past using rather heuristic approaches based on approximately optimizing non-convex problem formulations, of which up to now no theoretical learning bounds are known. In this paper, we show generalization error bounds for learning localized kernel classes where the localities are coupled using graph-based regularization. We propose a novel learning localized kernels algorithm based on this hypothesis class that is formulated as a convex optimization problem using a pre-obtained cluster structure of the data. We derive dual representations using Fenchel conjugation theory, based on which we give a simple yet efficient wrapper-based optimization algorithm. We apply the method to problems involving multiple heterogeneous data sources, taken from domains of computational biology and computer vision. The results show that the proposed convex approach to learning localized kernels can achieve higher prediction accuracies than its global and non-convex local counterparts.}
}
Endnote
%0 Conference Paper
%T Theory and Algorithms for the Localized Setting of Learning Kernels
%A Yunwen Lei
%A Alexander Binder
%A Ürün Dogan
%A Marius Kloft
%B Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015
%C Proceedings of Machine Learning Research
%D 2015
%E Dmitry Storcheus
%E Afshin Rostamizadeh
%E Sanjiv Kumar
%F pmlr-v44-LeiBinDogKlo15
%I PMLR
%P 173--195
%U https://proceedings.mlr.press/v44/LeiBinDogKlo15.html
%V 44
%X We analyze the localized setting of learning kernels also known as localized multiple kernel learning. This problem has been addressed in the past using rather heuristic approaches based on approximately optimizing non-convex problem formulations, of which up to now no theoretical learning bounds are known. In this paper, we show generalization error bounds for learning localized kernel classes where the localities are coupled using graph-based regularization. We propose a novel learning localized kernels algorithm based on this hypothesis class that is formulated as a convex optimization problem using a pre-obtained cluster structure of the data. We derive dual representations using Fenchel conjugation theory, based on which we give a simple yet efficient wrapper-based optimization algorithm. We apply the method to problems involving multiple heterogeneous data sources, taken from domains of computational biology and computer vision. The results show that the proposed convex approach to learning localized kernels can achieve higher prediction accuracies than its global and non-convex local counterparts.
RIS
TY - CPAPER
TI - Theory and Algorithms for the Localized Setting of Learning Kernels
AU - Yunwen Lei
AU - Alexander Binder
AU - Ürün Dogan
AU - Marius Kloft
BT - Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015
DA - 2015/12/08
ED - Dmitry Storcheus
ED - Afshin Rostamizadeh
ED - Sanjiv Kumar
ID - pmlr-v44-LeiBinDogKlo15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 44
SP - 173
EP - 195
L1 - http://proceedings.mlr.press/v44/LeiBinDogKlo15.pdf
UR - https://proceedings.mlr.press/v44/LeiBinDogKlo15.html
AB - We analyze the localized setting of learning kernels also known as localized multiple kernel learning. This problem has been addressed in the past using rather heuristic approaches based on approximately optimizing non-convex problem formulations, of which up to now no theoretical learning bounds are known. In this paper, we show generalization error bounds for learning localized kernel classes where the localities are coupled using graph-based regularization. We propose a novel learning localized kernels algorithm based on this hypothesis class that is formulated as a convex optimization problem using a pre-obtained cluster structure of the data. We derive dual representations using Fenchel conjugation theory, based on which we give a simple yet efficient wrapper-based optimization algorithm. We apply the method to problems involving multiple heterogeneous data sources, taken from domains of computational biology and computer vision. The results show that the proposed convex approach to learning localized kernels can achieve higher prediction accuracies than its global and non-convex local counterparts.
ER -
APA
Lei, Y., Binder, A., Dogan, Ü. & Kloft, M.. (2015). Theory and Algorithms for the Localized Setting of Learning Kernels. Proceedings of the 1st International Workshop on Feature Extraction: Modern Questions and Challenges at NIPS 2015, in Proceedings of Machine Learning Research 44:173-195 Available from https://proceedings.mlr.press/v44/LeiBinDogKlo15.html.

Related Material