Compressed Maximum Likelihood

Yi Hao, Alon Orlitsky
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:4085-4095, 2021.

Abstract

Maximum likelihood (ML) is one of the most fundamental and general statistical estimation techniques. Inspired by recent advances in estimating distribution functionals, we propose compressed maximum likelihood (CML) that applies ML to the compressed samples. We then show that CML is sample-efficient for several essential learning tasks over both discrete and continuous domains, including learning densities with structures, estimating probability multisets, and inferring symmetric distribution functionals.
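To make the abstract's idea concrete, here is a minimal toy sketch (not the paper's construction) of the general pattern: compress the sample into a label-free summary, here the unordered pair of symbol counts for a two-symbol distribution, and then maximize the likelihood of that compressed statistic over candidate parameters. All names and the grid-search estimator below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np
from math import comb

rng = np.random.default_rng(0)

# Toy setup: two symbols with unknown probability multiset {p, 1-p}.
true_p = 0.3
n = 200
samples = rng.random(n) < true_p          # True = symbol "a", False = symbol "b"

# "Compress" the sample: keep only the unordered pair of symbol counts
# (a tiny analogue of a profile / probability multiset -- labels are discarded).
counts = [int(samples.sum()), int(n - samples.sum())]
profile = tuple(sorted(counts))           # e.g. (58, 142)

def profile_likelihood(p, profile, n):
    """Probability of observing the unordered count pair under Bernoulli(p)."""
    k, m = profile                        # k <= m, k + m == n
    if k == m:
        return comb(n, k) * p**k * (1 - p)**k
    # Either symbol could be the one with the smaller count.
    return comb(n, k) * (p**k * (1 - p)**m + p**m * (1 - p)**k)

# Maximize the likelihood of the compressed statistic over a parameter grid
# (a hypothetical, brute-force stand-in for the estimator studied in the paper).
grid = np.linspace(0.001, 0.999, 999)
est_p = grid[np.argmax([profile_likelihood(p, profile, n) for p in grid])]

# The estimate recovers the probability multiset {p, 1-p} up to labeling.
print(f"profile = {profile}, estimated multiset: "
      f"{{{est_p:.3f}, {1 - est_p:.3f}}}")
```

Because the labels are dropped before estimation, the sketch can only recover label-free quantities such as the probability multiset or symmetric functionals of the distribution, which is the regime the abstract mentions.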

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-hao21c,
  title     = {Compressed Maximum Likelihood},
  author    = {Hao, Yi and Orlitsky, Alon},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {4085--4095},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/hao21c/hao21c.pdf},
  url       = {https://proceedings.mlr.press/v139/hao21c.html},
  abstract  = {Maximum likelihood (ML) is one of the most fundamental and general statistical estimation techniques. Inspired by recent advances in estimating distribution functionals, we propose $\textit{compressed maximum likelihood}$ (CML) that applies ML to the compressed samples. We then show that CML is sample-efficient for several essential learning tasks over both discrete and continuous domains, including learning densities with structures, estimating probability multisets, and inferring symmetric distribution functionals.}
}
Endnote
%0 Conference Paper
%T Compressed Maximum Likelihood
%A Yi Hao
%A Alon Orlitsky
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-hao21c
%I PMLR
%P 4085--4095
%U https://proceedings.mlr.press/v139/hao21c.html
%V 139
%X Maximum likelihood (ML) is one of the most fundamental and general statistical estimation techniques. Inspired by recent advances in estimating distribution functionals, we propose $\textit{compressed maximum likelihood}$ (CML) that applies ML to the compressed samples. We then show that CML is sample-efficient for several essential learning tasks over both discrete and continuous domains, including learning densities with structures, estimating probability multisets, and inferring symmetric distribution functionals.
APA
Hao, Y. & Orlitsky, A. (2021). Compressed Maximum Likelihood. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:4085-4095. Available from https://proceedings.mlr.press/v139/hao21c.html.
