Computationally efficient reductions between some statistical models

Mengqi Lou, Guy Bresler, Ashwin Pananjady
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:771-771, 2025.

Abstract

We study the problem of approximately transforming a sample from a source statistical model to a sample from a target statistical model without knowing the parameters of the source model, and construct several such computationally efficient reductions between canonical statistical experiments. In particular, we provide computationally efficient procedures that approximately reduce uniform, Erlang, and Laplace location models to general target families. We illustrate our methodology by establishing nonasymptotic reductions between some canonical high-dimensional problems, spanning mixtures of experts, phase retrieval, and signal denoising. Notably, the reductions are structure-preserving and can accommodate missing data. We also point to a possible application in transforming one differentially private mechanism to another.
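To make the notion of a parameter-free reduction concrete, here is a minimal toy sketch in Python (not the paper's construction; the function name and parameters are illustrative only). Given a single draw from a Gaussian location model N(theta, 1) with theta unknown, adding independent Gaussian noise produces a draw from the target model N(theta, sigma^2) for any sigma >= 1, without ever estimating theta.

import numpy as np

def gaussian_variance_reduction(x, sigma_target, rng=None):
    # Toy reduction: x is one draw from N(theta, 1) with theta unknown.
    # Returns a draw distributed as N(theta, sigma_target**2), for any
    # sigma_target >= 1, without estimating theta.
    if sigma_target < 1.0:
        raise ValueError("only variance inflation is possible with this trick")
    rng = np.random.default_rng() if rng is None else rng
    # Adding independent N(0, sigma_target^2 - 1) noise inflates the variance
    # from 1 to sigma_target^2 while leaving the unknown mean theta untouched.
    return x + rng.normal(0.0, np.sqrt(sigma_target**2 - 1.0))

# Usage: the transformation is oblivious to theta.
theta = 2.7  # unknown to the reduction
x = np.random.default_rng(0).normal(theta, 1.0)
y = gaussian_variance_reduction(x, sigma_target=2.0)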

Cite this Paper


BibTeX
@InProceedings{pmlr-v272-lou25a,
  title     = {Computationally efficient reductions between some statistical models},
  author    = {Lou, Mengqi and Bresler, Guy and Pananjady, Ashwin},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  pages     = {771--771},
  year      = {2025},
  editor    = {Kamath, Gautam and Loh, Po-Ling},
  volume    = {272},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v272/main/assets/lou25a/lou25a.pdf},
  url       = {https://proceedings.mlr.press/v272/lou25a.html}
}
Endnote
%0 Conference Paper
%T Computationally efficient reductions between some statistical models
%A Mengqi Lou
%A Guy Bresler
%A Ashwin Pananjady
%B Proceedings of The 36th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Gautam Kamath
%E Po-Ling Loh
%F pmlr-v272-lou25a
%I PMLR
%P 771--771
%U https://proceedings.mlr.press/v272/lou25a.html
%V 272
APA
Lou, M., Bresler, G., & Pananjady, A. (2025). Computationally efficient reductions between some statistical models. Proceedings of The 36th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 272:771-771. Available from https://proceedings.mlr.press/v272/lou25a.html.
