Gradient Descent in RKHS with Importance Labeling

Tomoya Murata, Taiji Suzuki
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:1981-1989, 2021.

Abstract

Labeling cost is often expensive and is a fundamental limitation of supervised learning. In this paper, we study the importance labeling problem, in which we are given many unlabeled data points, select a limited number of them to be labeled, and then run a learning algorithm on the selected subset. We propose a new importance labeling scheme that effectively selects an informative subset of the unlabeled data for least squares regression in Reproducing Kernel Hilbert Spaces (RKHS). We analyze the generalization error of gradient descent combined with our labeling scheme and show that the proposed algorithm achieves the optimal rate of convergence in a much wider range of settings than the usual uniform sampling scheme, and in particular attains much better generalization in small-noise settings. Numerical experiments verify our theoretical findings.
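The pipeline described in the abstract can be sketched in a few lines. This is not the paper's exact scheme: here "importance labeling" is instantiated with ridge leverage scores of the kernel matrix, one natural measure of how informative each unlabeled point is; the RBF kernel, its bandwidths, and the toy target sin(3x) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def leverage_scores(K, lam=1e-2):
    # Ridge leverage scores: diag(K (K + n*lam*I)^{-1}), one plausible
    # importance measure for selecting points to label (an assumption,
    # not necessarily the paper's scheme)
    n = K.shape[0]
    return np.diag(np.linalg.solve(K + n * lam * np.eye(n), K))

def importance_label(X, budget, rng, lam=1e-2):
    # Select `budget` points to label, with probability proportional
    # to their leverage scores
    p = leverage_scores(rbf_kernel(X, X), lam)
    p = p / p.sum()
    return rng.choice(len(X), size=budget, replace=False, p=p)

def gd_least_squares(K, y, steps=2000):
    # Gradient descent on the kernel least squares objective
    # (1/2n)||K a - y||^2 in coefficient space; the step size is set
    # from the spectral norm of K so the iteration is convergent
    n = len(y)
    lr = n / np.linalg.norm(K, 2) ** 2
    alpha = np.zeros(n)
    for _ in range(steps):
        alpha -= lr * (K @ (K @ alpha - y)) / n
    return alpha

# Toy run: many unlabeled points, a small labeling budget, small noise
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
f = lambda x: np.sin(3 * x).ravel()

idx = importance_label(X, budget=40, rng=rng)
Xs = X[idx]                                       # the selected subset
ys = f(Xs) + 0.01 * rng.standard_normal(len(idx)) # labels, small noise

K = rbf_kernel(Xs, Xs, gamma=5.0)
alpha = gd_least_squares(K, ys)
pred = rbf_kernel(X, Xs, gamma=5.0) @ alpha       # predict on all data
mse = np.mean((pred - f(X)) ** 2)
```

The selection step only ever touches unlabeled inputs; labels are queried for the chosen subset alone, which is the point of the labeling-budget setting.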

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-murata21a,
  title     = {Gradient Descent in RKHS with Importance Labeling},
  author    = {Murata, Tomoya and Suzuki, Taiji},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {1981--1989},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/murata21a/murata21a.pdf},
  url       = {https://proceedings.mlr.press/v130/murata21a.html},
  abstract  = {Labeling cost is often expensive and is a fundamental limitation of supervised learning. In this paper, we study importance labeling problem, in which we are given many unlabeled data and select a limited number of data to be labeled from the unlabeled data, and then a learning algorithm is executed on the selected one. We propose a new importance labeling scheme that can effectively select an informative subset of unlabeled data in least squares regression in Reproducing Kernel Hilbert Spaces (RKHS). We analyze the generalization error of gradient descent combined with our labeling scheme and show that the proposed algorithm achieves the optimal rate of convergence in much wider settings and especially gives much better generalization ability in a small noise setting than the usual uniform sampling scheme. Numerical experiments verify our theoretical findings.}
}
Endnote
%0 Conference Paper
%T Gradient Descent in RKHS with Importance Labeling
%A Tomoya Murata
%A Taiji Suzuki
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-murata21a
%I PMLR
%P 1981--1989
%U https://proceedings.mlr.press/v130/murata21a.html
%V 130
%X Labeling cost is often expensive and is a fundamental limitation of supervised learning. In this paper, we study importance labeling problem, in which we are given many unlabeled data and select a limited number of data to be labeled from the unlabeled data, and then a learning algorithm is executed on the selected one. We propose a new importance labeling scheme that can effectively select an informative subset of unlabeled data in least squares regression in Reproducing Kernel Hilbert Spaces (RKHS). We analyze the generalization error of gradient descent combined with our labeling scheme and show that the proposed algorithm achieves the optimal rate of convergence in much wider settings and especially gives much better generalization ability in a small noise setting than the usual uniform sampling scheme. Numerical experiments verify our theoretical findings.
APA
Murata, T. & Suzuki, T. (2021). Gradient Descent in RKHS with Importance Labeling. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:1981-1989. Available from https://proceedings.mlr.press/v130/murata21a.html.