Local moment matching: A unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance

Yanjun Han, Jiantao Jiao, Tsachy Weissman
Proceedings of the 31st Conference On Learning Theory, PMLR 75:3189-3221, 2018.

Abstract

We present \emph{Local Moment Matching (LMM)}, a unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance. We construct an efficiently computable estimator that achieves the minimax rates in estimating the distribution up to permutation, and show that the plug-in approach of our unlabeled distribution estimator is “universal” in estimating symmetric functionals of discrete distributions. Instead of doing best polynomial approximation explicitly as in the existing literature on functional estimation, the plug-in approach conducts polynomial approximation implicitly and attains the optimal sample complexity for the entropy, power sum, and support size functionals.

Cite this Paper


BibTeX
@InProceedings{pmlr-v75-han18b,
  title     = {Local moment matching: A unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance},
  author    = {Han, Yanjun and Jiao, Jiantao and Weissman, Tsachy},
  booktitle = {Proceedings of the 31st Conference On Learning Theory},
  pages     = {3189--3221},
  year      = {2018},
  editor    = {Bubeck, Sébastien and Perchet, Vianney and Rigollet, Philippe},
  volume    = {75},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v75/han18b/han18b.pdf},
  url       = {https://proceedings.mlr.press/v75/han18b.html},
  abstract  = {We present \emph{Local Moment Matching (LMM)}, a unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance. We construct an efficiently computable estimator that achieves the minimax rates in estimating the distribution up to permutation, and show that the plug-in approach of our unlabeled distribution estimator is “universal” in estimating symmetric functionals of discrete distributions. Instead of doing best polynomial approximation explicitly as in the existing literature on functional estimation, the plug-in approach conducts polynomial approximation implicitly and attains the optimal sample complexity for the entropy, power sum, and support size functionals.}
}
Endnote
%0 Conference Paper
%T Local moment matching: A unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance
%A Yanjun Han
%A Jiantao Jiao
%A Tsachy Weissman
%B Proceedings of the 31st Conference On Learning Theory
%C Proceedings of Machine Learning Research
%D 2018
%E Sébastien Bubeck
%E Vianney Perchet
%E Philippe Rigollet
%F pmlr-v75-han18b
%I PMLR
%P 3189--3221
%U https://proceedings.mlr.press/v75/han18b.html
%V 75
%X We present \emph{Local Moment Matching (LMM)}, a unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance. We construct an efficiently computable estimator that achieves the minimax rates in estimating the distribution up to permutation, and show that the plug-in approach of our unlabeled distribution estimator is “universal” in estimating symmetric functionals of discrete distributions. Instead of doing best polynomial approximation explicitly as in the existing literature on functional estimation, the plug-in approach conducts polynomial approximation implicitly and attains the optimal sample complexity for the entropy, power sum, and support size functionals.
APA
Han, Y., Jiao, J. & Weissman, T. (2018). Local moment matching: A unified methodology for symmetric functional estimation and distribution estimation under Wasserstein distance. Proceedings of the 31st Conference On Learning Theory, in Proceedings of Machine Learning Research 75:3189-3221. Available from https://proceedings.mlr.press/v75/han18b.html.