Unsupervised Multiple Kernel Learning

Jinfeng Zhuang, Jialei Wang, Steven C. H. Hoi, Xiangyang Lan; Proceedings of the Asian Conference on Machine Learning, PMLR 20:129-144, 2011.
Abstract
Traditional multiple kernel learning (MKL) algorithms are essentially supervised in the sense that the kernel learning task requires the class labels of the training data. However, class labels may not always be available prior to the kernel learning task in some real-world scenarios, e.g., in an early preprocessing step of a classification task or in an unsupervised learning task such as dimension reduction. In this paper, we investigate the problem of Unsupervised Multiple Kernel Learning (UMKL), which does not require the class labels of training data that a conventional multiple kernel learning task needs. Since a kernel essentially defines pairwise similarity between any two examples, our unsupervised kernel learning method follows two intuitive principles: (1) a good kernel should allow every example to be well reconstructed from its localized bases weighted by the kernel values; (2) a good kernel should induce kernel values that coincide with the local geometry of the data. We formulate unsupervised multiple kernel learning as an optimization task and propose an efficient alternating optimization algorithm to solve it. Empirical results on both classification and dimension reduction tasks validate the efficacy of the proposed UMKL algorithm.
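The two principles above translate directly into an optimization problem: learn convex combination weights over a set of base kernels so that (1) each point is well reconstructed from its most similar neighbors weighted by kernel values, and (2) large kernel values are discouraged between points that are geometrically far apart. The Python/NumPy sketch below is a minimal illustration of this idea and of the alternating scheme, not the authors' exact formulation: the paper jointly optimizes a basis-selection matrix with a sparsity penalty, whereas here the localized bases are simply each point's k most similar neighbors under the current kernel, and all names and parameters (umkl_sketch, gaussian_kernels, gamma, lr, etc.) are assumptions made for illustration.

import numpy as np

def gaussian_kernels(X, bandwidths):
    # Base kernel family: Gaussian (RBF) kernels at several bandwidths.
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return [np.exp(-d2 / (2.0 * s ** 2)) for s in bandwidths], d2

def project_simplex(v):
    # Euclidean projection onto {mu >= 0, sum(mu) = 1} (Duchi et al., 2008).
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u - css / idx > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def umkl_sketch(X, bandwidths=(0.5, 1.0, 2.0), k=5, gamma=1.0,
                lr=1e-4, outer=10, inner=50):
    # Alternate between (a) choosing each point's k "localized bases" under
    # the current combined kernel and (b) projected-gradient updates of the
    # kernel weights mu on the probability simplex.
    n = X.shape[0]
    kernels, d2 = gaussian_kernels(X, bandwidths)
    mu = np.full(len(kernels), 1.0 / len(kernels))  # uniform initial weights
    for _ in range(outer):
        K = sum(m * Kt for m, Kt in zip(mu, kernels))
        # (a) localized bases: the k points most similar to x_i, self excluded.
        Kq = K.copy()
        np.fill_diagonal(Kq, -np.inf)
        nbrs = np.argsort(-Kq, axis=1)[:, :k]
        B = np.zeros((n, n))
        B[np.arange(n)[:, None], nbrs] = 1.0
        # (b) minimize reconstruction error plus a distortion penalty over mu:
        #   J(mu) = sum_i ||sum_j B_ij K_ij x_j - x_i||^2
        #           + gamma * sum_ij B_ij K_ij ||x_i - x_j||^2
        for _ in range(inner):
            K = sum(m * Kt for m, Kt in zip(mu, kernels))
            E = (B * K) @ X - X  # reconstruction residuals, one row per point
            grad = np.array([2.0 * np.sum(E * ((B * Kt) @ X))
                             + gamma * np.sum(B * Kt * d2)
                             for Kt in kernels])
            mu = project_simplex(mu - lr * grad)
    return mu, sum(m * Kt for m, Kt in zip(mu, kernels))

# Usage: learn kernel weights for 100 points in 3 dimensions.
X = np.random.default_rng(0).normal(size=(100, 3))
mu, K = umkl_sketch(X)
print("learned kernel weights:", mu)

On data with multi-scale structure, the learned weights typically concentrate on the bandwidths that best preserve local neighborhoods; the resulting combined kernel K can then be fed to any off-the-shelf kernel method, matching the paper's use of the learned kernel in both classification and dimension reduction experiments.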
@InProceedings{pmlr-v20-zhuang11,
title = {Unsupervised Multiple Kernel Learning},
author = {Jinfeng Zhuang and Jialei Wang and Steven C. H. Hoi and Xiangyang Lan},
booktitle = {Proceedings of the Asian Conference on Machine Learning},
pages = {129--144},
year = {2011},
editor = {Chun-Nan Hsu and Wee Sun Lee},
volume = {20},
series = {Proceedings of Machine Learning Research},
address = {South Garden Hotels and Resorts, Taoyuan, Taiwan},
month = {14--15 Nov},
publisher = {PMLR},
pdf = {http://proceedings.mlr.press/v20/zhuang11/zhuang11.pdf},
url = {http://proceedings.mlr.press/v20/zhuang11.html},
abstract = {Traditional multiple kernel learning (MKL) algorithms are essentially supervised in the sense that the kernel learning task requires the class labels of the training data. However, class labels may not always be available prior to the kernel learning task in some real-world scenarios, e.g., in an early preprocessing step of a classification task or in an unsupervised learning task such as dimension reduction. In this paper, we investigate the problem of Unsupervised Multiple Kernel Learning (UMKL), which does not require the class labels of training data that a conventional multiple kernel learning task needs. Since a kernel essentially defines pairwise similarity between any two examples, our unsupervised kernel learning method follows two intuitive principles: (1) a good kernel should allow every example to be well reconstructed from its localized bases weighted by the kernel values; (2) a good kernel should induce kernel values that coincide with the local geometry of the data. We formulate unsupervised multiple kernel learning as an optimization task and propose an efficient alternating optimization algorithm to solve it. Empirical results on both classification and dimension reduction tasks validate the efficacy of the proposed UMKL algorithm.}
}
%0 Conference Paper
%T Unsupervised Multiple Kernel Learning
%A Jinfeng Zhuang
%A Jialei Wang
%A Steven C. H. Hoi
%A Xiangyang Lan
%B Proceedings of the Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2011
%E Chun-Nan Hsu
%E Wee Sun Lee
%F pmlr-v20-zhuang11
%I PMLR
%J Proceedings of Machine Learning Research
%P 129--144
%U http://proceedings.mlr.press/v20/zhuang11.html
%V 20
%W PMLR
%X Traditional multiple kernel learning (MKL) algorithms are essentially supervised in the sense that the kernel learning task requires the class labels of the training data. However, class labels may not always be available prior to the kernel learning task in some real-world scenarios, e.g., in an early preprocessing step of a classification task or in an unsupervised learning task such as dimension reduction. In this paper, we investigate the problem of Unsupervised Multiple Kernel Learning (UMKL), which does not require the class labels of training data that a conventional multiple kernel learning task needs. Since a kernel essentially defines pairwise similarity between any two examples, our unsupervised kernel learning method follows two intuitive principles: (1) a good kernel should allow every example to be well reconstructed from its localized bases weighted by the kernel values; (2) a good kernel should induce kernel values that coincide with the local geometry of the data. We formulate unsupervised multiple kernel learning as an optimization task and propose an efficient alternating optimization algorithm to solve it. Empirical results on both classification and dimension reduction tasks validate the efficacy of the proposed UMKL algorithm.
TY - CPAPER
TI - Unsupervised Multiple Kernel Learning
AU - Jinfeng Zhuang
AU - Jialei Wang
AU - Steven C. H. Hoi
AU - Xiangyang Lan
BT - Proceedings of the Asian Conference on Machine Learning
PY - 2011/11/17
DA - 2011/11/17
ED - Chun-Nan Hsu
ED - Wee Sun Lee
ID - pmlr-v20-zhuang11
PB - PMLR
SP - 129
DP - PMLR
EP - 144
L1 - http://proceedings.mlr.press/v20/zhuang11/zhuang11.pdf
UR - http://proceedings.mlr.press/v20/zhuang11.html
AB - Traditional multiple kernel learning (MKL) algorithms are essentially supervised in the sense that the kernel learning task requires the class labels of the training data. However, class labels may not always be available prior to the kernel learning task in some real-world scenarios, e.g., in an early preprocessing step of a classification task or in an unsupervised learning task such as dimension reduction. In this paper, we investigate the problem of Unsupervised Multiple Kernel Learning (UMKL), which does not require the class labels of training data that a conventional multiple kernel learning task needs. Since a kernel essentially defines pairwise similarity between any two examples, our unsupervised kernel learning method follows two intuitive principles: (1) a good kernel should allow every example to be well reconstructed from its localized bases weighted by the kernel values; (2) a good kernel should induce kernel values that coincide with the local geometry of the data. We formulate unsupervised multiple kernel learning as an optimization task and propose an efficient alternating optimization algorithm to solve it. Empirical results on both classification and dimension reduction tasks validate the efficacy of the proposed UMKL algorithm.
ER -
Zhuang, J., Wang, J., Hoi, S. C. H. & Lan, X. (2011). Unsupervised Multiple Kernel Learning. Proceedings of the Asian Conference on Machine Learning, in PMLR 20:129-144.