Kernel Mean Estimation and Stein Effect

Krikamol Muandet, Kenji Fukumizu, Bharath Sriperumbudur, Arthur Gretton, Bernhard Schoelkopf
Proceedings of the 31st International Conference on Machine Learning, PMLR 32(1):10-18, 2014.

Abstract

A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, the empirical average is the standard estimate of the true kernel mean. We show that this estimator can be improved, owing to a well-known phenomenon in statistics called the Stein phenomenon. Our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.
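To make the abstract concrete, the sketch below (not the authors' implementation) evaluates the standard empirical kernel mean, mu_hat(.) = (1/n) sum_i k(x_i, .), at a few query points, and a simple shrinkage variant that scales it by (1 - alpha), i.e. shrinks it toward the zero function. The Gaussian RBF kernel, its bandwidth, and the fixed shrinkage weight alpha are illustrative assumptions; the estimators studied in the paper choose the amount of shrinkage more carefully.

    # Minimal sketch of the empirical kernel mean and a simple shrinkage variant.
    # Kernel choice, bandwidth, and the fixed weight `alpha` are illustrative assumptions.
    import numpy as np

    def rbf_gram(X, Y, bandwidth=1.0):
        """Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 * bandwidth^2))."""
        sq_dists = (np.sum(X**2, axis=1)[:, None]
                    + np.sum(Y**2, axis=1)[None, :]
                    - 2.0 * X @ Y.T)
        return np.exp(-np.maximum(sq_dists, 0.0) / (2.0 * bandwidth**2))

    def empirical_kernel_mean(X, query, bandwidth=1.0):
        """Standard estimator mu_hat(.) = (1/n) sum_i k(x_i, .), evaluated at query points."""
        return rbf_gram(query, X, bandwidth).mean(axis=1)

    def shrinkage_kernel_mean(X, query, alpha=0.1, bandwidth=1.0):
        """Shrink the empirical kernel mean toward the zero function:
        mu_alpha = (1 - alpha) * mu_hat, with 0 <= alpha < 1 (alpha fixed arbitrarily here)."""
        return (1.0 - alpha) * empirical_kernel_mean(X, query, bandwidth)

    # Usage: mean embedding of a 2-D Gaussian sample, evaluated at a few query points.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 2))
    queries = rng.normal(size=(5, 2))
    print(empirical_kernel_mean(X, queries))
    print(shrinkage_kernel_mean(X, queries, alpha=0.1))

The shrinkage step trades a small bias for a reduction in variance; the paper's analysis characterizes when such trades yield a strictly smaller risk than the empirical average.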

Cite this Paper


BibTeX
@InProceedings{pmlr-v32-muandet14,
  title     = {Kernel Mean Estimation and Stein Effect},
  author    = {Muandet, Krikamol and Fukumizu, Kenji and Sriperumbudur, Bharath and Gretton, Arthur and Schoelkopf, Bernhard},
  booktitle = {Proceedings of the 31st International Conference on Machine Learning},
  pages     = {10--18},
  year      = {2014},
  editor    = {Xing, Eric P. and Jebara, Tony},
  volume    = {32},
  number    = {1},
  series    = {Proceedings of Machine Learning Research},
  address   = {Bejing, China},
  month     = {22--24 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v32/muandet14.pdf},
  url       = {https://proceedings.mlr.press/v32/muandet14.html},
  abstract  = {A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, the empirical average is the standard estimate of the true kernel mean. We show that this estimator can be improved, owing to a well-known phenomenon in statistics called the Stein phenomenon. Our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.}
}
Endnote
%0 Conference Paper
%T Kernel Mean Estimation and Stein Effect
%A Krikamol Muandet
%A Kenji Fukumizu
%A Bharath Sriperumbudur
%A Arthur Gretton
%A Bernhard Schoelkopf
%B Proceedings of the 31st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2014
%E Eric P. Xing
%E Tony Jebara
%F pmlr-v32-muandet14
%I PMLR
%P 10--18
%U https://proceedings.mlr.press/v32/muandet14.html
%V 32
%N 1
%X A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, the empirical average is the standard estimate of the true kernel mean. We show that this estimator can be improved, owing to a well-known phenomenon in statistics called the Stein phenomenon. Our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.
RIS
TY - CPAPER
TI - Kernel Mean Estimation and Stein Effect
AU - Krikamol Muandet
AU - Kenji Fukumizu
AU - Bharath Sriperumbudur
AU - Arthur Gretton
AU - Bernhard Schoelkopf
BT - Proceedings of the 31st International Conference on Machine Learning
DA - 2014/01/27
ED - Eric P. Xing
ED - Tony Jebara
ID - pmlr-v32-muandet14
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 32
IS - 1
SP - 10
EP - 18
L1 - http://proceedings.mlr.press/v32/muandet14.pdf
UR - https://proceedings.mlr.press/v32/muandet14.html
AB - A mean function in a reproducing kernel Hilbert space (RKHS), or a kernel mean, is an important part of many algorithms ranging from kernel principal component analysis to Hilbert-space embedding of distributions. Given a finite sample, the empirical average is the standard estimate of the true kernel mean. We show that this estimator can be improved, owing to a well-known phenomenon in statistics called the Stein phenomenon. Our theoretical analysis reveals the existence of a wide class of estimators that are better than the standard one. Focusing on a subset of this class, we propose efficient shrinkage estimators for the kernel mean. Empirical evaluations on several applications clearly demonstrate that the proposed estimators outperform the standard kernel mean estimator.
ER -
APA
Muandet, K., Fukumizu, K., Sriperumbudur, B., Gretton, A. & Schoelkopf, B. (2014). Kernel Mean Estimation and Stein Effect. Proceedings of the 31st International Conference on Machine Learning, in Proceedings of Machine Learning Research 32(1):10-18. Available from https://proceedings.mlr.press/v32/muandet14.html.