Exact asymptotics for phase retrieval and compressed sensing with random generative priors

Benjamin Aubin, Bruno Loureiro, Antoine Baker, Florent Krzakala, Lenka Zdeborová
Proceedings of The First Mathematical and Scientific Machine Learning Conference, PMLR 107:55-73, 2020.

Abstract

We consider the problems of compressed sensing and of (real-valued) phase retrieval with a random measurement matrix. We derive sharp asymptotics for the information-theoretically optimal performance and for the best known polynomial-time algorithm, for an ensemble of generative priors consisting of fully connected deep neural networks with random weight matrices and arbitrary activations. We compare this performance to that of sparse separable priors and conclude that, in all cases analysed, generative priors have a smaller statistical-to-algorithmic gap than sparse priors, giving theoretical support to previous experimental observations that generative priors may be advantageous in terms of algorithmic performance. In particular, while sparsity does not allow compressive phase retrieval to be performed efficiently close to its information-theoretic limit, we find that compressive phase retrieval becomes tractable under the random generative prior.
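To make the two measurement models concrete, the sketch below generates a signal from a random generative prior and produces compressed-sensing and real-valued phase-retrieval measurements. The dimensions, the single-layer ReLU generator, and the Gaussian scalings are illustrative assumptions; the paper analyses multi-layer networks with arbitrary activations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: latent size k, signal size n, m measurements.
k, n, m = 20, 100, 50

# Random generative prior: a one-layer fully connected network with
# i.i.d. Gaussian weights and a ReLU activation (a minimal stand-in for
# the deep networks studied in the paper).
W = rng.standard_normal((n, k)) / np.sqrt(k)
z = rng.standard_normal(k)
x = np.maximum(W @ z, 0.0)  # signal x = G(z)

# Random measurement matrix with i.i.d. Gaussian entries.
A = rng.standard_normal((m, n)) / np.sqrt(n)

y_cs = A @ x          # compressed sensing: linear measurements
y_pr = np.abs(A @ x)  # real-valued phase retrieval: the sign is lost
```

Recovering x (equivalently, the latent z) from y_cs or y_pr is the inference task whose information-theoretic and algorithmic limits the paper characterises asymptotically.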

Cite this Paper


BibTeX
@InProceedings{pmlr-v107-aubin20a,
  title     = {Exact asymptotics for phase retrieval and compressed sensing with random generative priors},
  author    = {Aubin, Benjamin and Loureiro, Bruno and Baker, Antoine and Krzakala, Florent and Zdeborov\'a, Lenka},
  booktitle = {Proceedings of The First Mathematical and Scientific Machine Learning Conference},
  pages     = {55--73},
  year      = {2020},
  editor    = {Lu, Jianfeng and Ward, Rachel},
  volume    = {107},
  series    = {Proceedings of Machine Learning Research},
  month     = {20--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v107/aubin20a/aubin20a.pdf},
  url       = {https://proceedings.mlr.press/v107/aubin20a.html},
  abstract  = {We consider the problem of compressed sensing and of (real-valued) phase retrieval with random measurement matrix. We derive sharp asymptotics for the information-theoretically optimal performance and for the best known polynomial algorithm for an ensemble of generative priors consisting of fully connected deep neural networks with random weight matrices and arbitrary activations. We compare the performance to sparse separable priors and conclude that in all cases analysed generative priors have a smaller statistical-to-algorithmic gap than sparse priors, giving theoretical support to previous experimental observations that generative priors might be advantageous in terms of algorithmic performance. In particular, while sparsity does not allow to perform compressive phase retrieval efficiently close to its information-theoretic limit, it is found that under the random generative prior compressed phase retrieval becomes tractable.}
}
Endnote
%0 Conference Paper
%T Exact asymptotics for phase retrieval and compressed sensing with random generative priors
%A Benjamin Aubin
%A Bruno Loureiro
%A Antoine Baker
%A Florent Krzakala
%A Lenka Zdeborová
%B Proceedings of The First Mathematical and Scientific Machine Learning Conference
%C Proceedings of Machine Learning Research
%D 2020
%E Jianfeng Lu
%E Rachel Ward
%F pmlr-v107-aubin20a
%I PMLR
%P 55--73
%U https://proceedings.mlr.press/v107/aubin20a.html
%V 107
%X We consider the problem of compressed sensing and of (real-valued) phase retrieval with random measurement matrix. We derive sharp asymptotics for the information-theoretically optimal performance and for the best known polynomial algorithm for an ensemble of generative priors consisting of fully connected deep neural networks with random weight matrices and arbitrary activations. We compare the performance to sparse separable priors and conclude that in all cases analysed generative priors have a smaller statistical-to-algorithmic gap than sparse priors, giving theoretical support to previous experimental observations that generative priors might be advantageous in terms of algorithmic performance. In particular, while sparsity does not allow to perform compressive phase retrieval efficiently close to its information-theoretic limit, it is found that under the random generative prior compressed phase retrieval becomes tractable.
APA
Aubin, B., Loureiro, B., Baker, A., Krzakala, F. & Zdeborová, L. (2020). Exact asymptotics for phase retrieval and compressed sensing with random generative priors. Proceedings of The First Mathematical and Scientific Machine Learning Conference, in Proceedings of Machine Learning Research 107:55-73. Available from https://proceedings.mlr.press/v107/aubin20a.html.