Learning low-rank output kernels

Francesco Dinuzzo, Kenji Fukumizu
Proceedings of the Asian Conference on Machine Learning, PMLR 20:181-196, 2011.

Abstract

Output kernel learning techniques make it possible to simultaneously learn a vector-valued function and a positive semidefinite matrix that describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel learning and a regularization problem for an architecture with two layers. Then, we show that a variety of methods, such as nuclear norm regularized regression, reduced-rank regression, principal component analysis, and low-rank matrix approximation, can be seen as special cases of the output kernel learning framework. Finally, we introduce a block coordinate descent strategy for learning low-rank output kernels.
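
To make the ingredients of the abstract concrete, the sketch below illustrates one possible block coordinate descent scheme for a squared-loss output kernel learning objective in which the output kernel is parameterized through a low-rank factor B (so L = B B^T). The assumed objective, the placeholder gradient-step block solvers, and the name low_rank_okl are illustrative assumptions only; the paper's exact formulation and update rules may differ.

import numpy as np

# Hypothetical sketch (not the paper's exact algorithm): block coordinate
# descent for a squared-loss output kernel learning objective where the
# m x m output kernel is L = B B^T with a low-rank factor B (m x p).
# Assumed objective:
#   J(C, B) = 0.5*||Y - K C B B^T||_F^2 + 0.5*lam*tr(B B^T C^T K C)
# K: n x n input kernel matrix, Y: n x m targets, C: n x m coefficients.
def low_rank_okl(K, Y, rank, lam=1.0, n_outer=50, n_inner=10, step=1e-3, seed=0):
    n, m = Y.shape
    rng = np.random.default_rng(seed)
    C = np.zeros((n, m))
    B = 0.1 * rng.standard_normal((m, rank))
    for _ in range(n_outer):
        # C-block: with the factor B (hence L) fixed, descend on C.
        L = B @ B.T
        for _ in range(n_inner):
            R = K @ C @ L - Y                     # residual of the fit
            grad_C = K @ R @ L + lam * K @ C @ L  # gradient of J w.r.t. C
            C -= step * grad_C
        # B-block: with C fixed, descend on the low-rank factor B.
        E = K @ C
        for _ in range(n_inner):
            R = E @ B @ B.T - Y
            grad_B = E.T @ R @ B + R.T @ E @ B + lam * (C.T @ E @ B)
            B -= step * grad_B
    return C, B

# Example call: given an input kernel K and multi-output targets Y,
#   C, B = low_rank_okl(K, Y, rank=2)
# the learned output kernel is B @ B.T and fitted values are K @ C @ B @ B.T.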

Cite this Paper


BibTeX
@InProceedings{pmlr-v20-dinuzzo11,
  title     = {Learning low-rank output kernels},
  author    = {Dinuzzo, Francesco and Fukumizu, Kenji},
  booktitle = {Proceedings of the Asian Conference on Machine Learning},
  pages     = {181--196},
  year      = {2011},
  editor    = {Hsu, Chun-Nan and Lee, Wee Sun},
  volume    = {20},
  series    = {Proceedings of Machine Learning Research},
  address   = {South Garden Hotels and Resorts, Taoyuan, Taiwan},
  month     = {14--15 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v20/dinuzzo11/dinuzzo11.pdf},
  url       = {https://proceedings.mlr.press/v20/dinuzzo11.html},
  abstract  = {Output kernel learning techniques allow to simultaneously learn a vector-valued function and a positive semidefinite matrix which describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel learning and a regularization problem for an architecture with two layers. Then, we show that a variety of methods such as nuclear norm regularized regression, reduced-rank regression, principal component analysis, and low rank matrix approximation can be seen as special cases of the output kernel learning framework. Finally, we introduce a block coordinate descent strategy for learning low-rank output kernels.}
}
Endnote
%0 Conference Paper
%T Learning low-rank output kernels
%A Francesco Dinuzzo
%A Kenji Fukumizu
%B Proceedings of the Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2011
%E Chun-Nan Hsu
%E Wee Sun Lee
%F pmlr-v20-dinuzzo11
%I PMLR
%P 181--196
%U https://proceedings.mlr.press/v20/dinuzzo11.html
%V 20
%X Output kernel learning techniques allow to simultaneously learn a vector-valued function and a positive semidefinite matrix which describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel learning and a regularization problem for an architecture with two layers. Then, we show that a variety of methods such as nuclear norm regularized regression, reduced-rank regression, principal component analysis, and low rank matrix approximation can be seen as special cases of the output kernel learning framework. Finally, we introduce a block coordinate descent strategy for learning low-rank output kernels.
RIS
TY - CPAPER
TI - Learning low-rank output kernels
AU - Francesco Dinuzzo
AU - Kenji Fukumizu
BT - Proceedings of the Asian Conference on Machine Learning
DA - 2011/11/17
ED - Chun-Nan Hsu
ED - Wee Sun Lee
ID - pmlr-v20-dinuzzo11
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 20
SP - 181
EP - 196
L1 - http://proceedings.mlr.press/v20/dinuzzo11/dinuzzo11.pdf
UR - https://proceedings.mlr.press/v20/dinuzzo11.html
AB - Output kernel learning techniques allow to simultaneously learn a vector-valued function and a positive semidefinite matrix which describes the relationships between the outputs. In this paper, we introduce a new formulation that imposes a low-rank constraint on the output kernel and operates directly on a factor of the kernel matrix. First, we investigate the connection between output kernel learning and a regularization problem for an architecture with two layers. Then, we show that a variety of methods such as nuclear norm regularized regression, reduced-rank regression, principal component analysis, and low rank matrix approximation can be seen as special cases of the output kernel learning framework. Finally, we introduce a block coordinate descent strategy for learning low-rank output kernels.
ER -
APA
Dinuzzo, F. & Fukumizu, K. (2011). Learning low-rank output kernels. Proceedings of the Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 20:181-196. Available from https://proceedings.mlr.press/v20/dinuzzo11.html.
