Kernel Mean Estimation and Stein Effect

Krikamol Muandet, Kenji Fukumizu, Bharath Sriperumbudur, Arthur Gretton, Bernhard Schoelkopf
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):10-18, 2014.

Abstract

A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, an empirical average is the standard estimate for the true kernel mean. We show that this estimator can be improved due to a well-known phenomenon in statistics called Stein's phenomenon. Our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.
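To make the two estimators mentioned above concrete, the following is a minimal sketch rather than the authors' method: it represents the standard empirical kernel mean by uniform weights over the sample and contrasts it with a generic shrink-toward-zero variant that illustrates the bias-variance trade behind the Stein effect. The Gaussian RBF kernel, the alpha parameter, and the shrinkage rule are illustrative assumptions; the paper's shrinkage estimators are defined in the full text.

# A minimal sketch (not the paper's exact estimator): the standard empirical
# kernel mean and a simple shrinkage variant, assuming a Gaussian RBF kernel.
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T
    )
    return np.exp(-gamma * sq_dists)

def empirical_kernel_mean_weights(n):
    """Standard estimator: uniform weights, i.e. mu_hat = (1/n) sum_i k(x_i, .)."""
    return np.full(n, 1.0 / n)

def shrinkage_kernel_mean_weights(n, alpha):
    """Illustrative shrinkage toward zero: mu_alpha = (1 - alpha) * mu_hat.
    alpha in [0, 1) trades a small bias for reduced variance (Stein effect)."""
    return (1.0 - alpha) * np.full(n, 1.0 / n)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))   # finite sample from the underlying distribution
    K = rbf_kernel(X, X)            # Gram matrix on the sample
    w_std = empirical_kernel_mean_weights(len(X))
    w_shr = shrinkage_kernel_mean_weights(len(X), alpha=0.1)
    # Squared RKHS norm of each estimate: ||sum_i w_i k(x_i, .)||^2 = w' K w
    print("||mu_hat||^2   =", w_std @ K @ w_std)
    print("||mu_alpha||^2 =", w_shr @ K @ w_shr)

Both estimates live in the span of the feature maps k(x_i, .), so they are fully described by their weight vectors; the shrinkage version simply scales the uniform weights toward zero.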

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-muandet14,
  title     = {Kernel Mean Estimation and Stein Effect},
  author    = {Krikamol Muandet and Kenji Fukumizu and Bharath Sriperumbudur and Arthur Gretton and Bernhard Schoelkopf},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {10--18},
  year      = {2014},
  editor    = {Eric P. Xing and Tony Jebara},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Bejing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/muandet14.pdf},
  url       = {http://proceedings.mlr.press/v32/muandet14.html},
  abstract  = {A mean function in reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, an empirical average is the standard estimate for the true kernel mean. We show that this estimator can be improved due to a well-known phenomenon in statistics called Stein phenomenon. After consideration, our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.}
}
Endnote
%0 Conference Paper
%T Kernel Mean Estimation and Stein Effect
%A Krikamol Muandet
%A Kenji Fukumizu
%A Bharath Sriperumbudur
%A Arthur Gretton
%A Bernhard Schoelkopf
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-muandet14
%I PMLR
%J Proceedings of Machine Learning Research
%P 10--18
%U http://proceedings.mlr.press
%V 32
%N 1
%W PMLR
%X A mean function in reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, an empirical average is the standard estimate for the true kernel mean. We show that this estimator can be improved due to a well-known phenomenon in statistics called Stein phenomenon. After consideration, our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.
RIS
TY - CPAPER
TI - Kernel Mean Estimation and Stein Effect
AU - Krikamol Muandet
AU - Kenji Fukumizu
AU - Bharath Sriperumbudur
AU - Arthur Gretton
AU - Bernhard Schoelkopf
BT - Proceedings of the 31st International Conference on Machine Learning
PY - 2014/01/27
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-muandet14
PB - PMLR
SP - 10
DP - PMLR
EP - 18
L1 - http://proceedings.mlr.press/v32/muandet14.pdf
UR - http://proceedings.mlr.press/v32/muandet14.html
AB - A mean function in reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, an empirical average is the standard estimate for the true kernel mean. We show that this estimator can be improved due to a well-known phenomenon in statistics called Stein phenomenon. After consideration, our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.
ER -
APA
Muandet, K., Fukumizu, K., Sriperumbudur, B., Gretton, A. & Schoelkopf, B. (2014). Kernel Mean Estimation and Stein Effect. Proceedings of the 31st International Conference on Machine Learning, in PMLR 32(1):10-18.
