The Kernel Path in Kernelized LASSO
Gang Wang, Dit-Yan Yeung, Frederick H. Lochovsky
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:580-587, 2007.
Abstract
Kernel methods implicitly map data points from the input space to some feature space where even relatively simple algorithms such as linear methods can deliver very impressive performance. Of crucial importance though is the choice of the kernel function, which determines the mapping between the input space and the feature space. The past few years have seen many efforts in learning either the kernel function or the kernel matrix. In this paper, we study the problem of learning the kernel hyperparameter in the context of the kernelized LASSO regression model. Specifically, we propose a solution path algorithm with respect to the hyperparameter of the kernel function. As the kernel hyperparameter changes its value, the solution path can be traced exactly without having to train the model multiple times. As a result, the optimal solution can be identified efficiently. Some simulation results will be presented to demonstrate the effectiveness of our proposed kernel path algorithm.
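To make the setting concrete, here is a minimal numpy-only sketch of the *naive* approach the paper improves on: fitting a kernelized LASSO from scratch at every value of the kernel hyperparameter (here the RBF width `gamma`) and keeping the best fit. The kernel path algorithm proposed in the paper traces the solution exactly as the hyperparameter varies, avoiding this repeated retraining. The coordinate-descent LASSO solver, the toy data, and the `gamma` grid below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    # K[i, j] = exp(-gamma * ||x_i - z_j||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lasso_cd(K, y, lam, n_iter=200):
    # Coordinate descent for: min_beta 0.5*||y - K beta||^2 + lam*||beta||_1
    n = K.shape[1]
    beta = np.zeros(n)
    col_sq = (K ** 2).sum(axis=0)  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(n):
            # partial residual excluding coordinate j
            r = y - K @ beta + K[:, j] * beta[j]
            rho = K[:, j] @ r
            # soft-thresholding update
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

# Toy 1-D regression problem
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

# Naive grid search over the kernel hyperparameter:
# one full LASSO fit per grid point -- exactly the cost
# that an exact solution path algorithm avoids.
errs = {}
for gamma in [0.01, 0.1, 1.0, 10.0]:
    K = rbf_kernel(X, X, gamma)
    beta = lasso_cd(K, y, lam=0.5)
    errs[gamma] = float(np.mean((y - K @ beta) ** 2))
best_gamma = min(errs, key=errs.get)
```

The grid search gives the solution only at a few discrete hyperparameter values; the paper's contribution is that the whole continuum of solutions between grid points can be traced exactly without refitting.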
@InProceedings{pmlr-v2-wang07a,
title = {The Kernel Path in Kernelized LASSO},
author = {Gang Wang and Dit-Yan Yeung and Frederick H. Lochovsky},
booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
pages = {580--587},
year = {2007},
editor = {Marina Meila and Xiaotong Shen},
volume = {2},
series = {Proceedings of Machine Learning Research},
address = {San Juan, Puerto Rico},
month = {21--24 Mar},
publisher = {PMLR},
pdf = {http://proceedings.mlr.press/v2/wang07a/wang07a.pdf},
url = {http://proceedings.mlr.press/v2/wang07a.html},
abstract = {Kernel methods implicitly map data points from the input space to some feature space where even relatively simple algorithms such as linear methods can deliver very impressive performance. Of crucial importance though is the choice of the kernel function, which determines the mapping between the input space and the feature space. The past few years have seen many efforts in learning either the kernel function or the kernel matrix. In this paper, we study the problem of learning the kernel hyperparameter in the context of the kernelized LASSO regression model. Specifically, we propose a solution path algorithm with respect to the hyperparameter of the kernel function. As the kernel hyperparameter changes its value, the solution path can be traced exactly without having to train the model multiple times. As a result, the optimal solution can be identified efficiently. Some simulation results will be presented to demonstrate the effectiveness of our proposed kernel path algorithm.}
}