Fundamental limits of symmetric low-rank matrix estimation

Marc Lelarge, Léo Miolane
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:1297-1301, 2017.

Abstract

We consider the high-dimensional inference problem where the signal is a low-rank symmetric matrix corrupted by additive Gaussian noise. Given a probabilistic model for the low-rank matrix, we compute the limit, in the large-dimension setting, of the mutual information between the signal and the observations, as well as the matrix minimum mean square error, while the rank of the signal remains constant. We unify and generalize a number of recent works on PCA, sparse PCA, submatrix localization, and community detection by computing the information-theoretic limits for these problems in the high-noise regime. This allows us to locate precisely the information-theoretic thresholds for the above-mentioned problems.

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-lelarge17a,
  title     = {Fundamental limits of symmetric low-rank matrix estimation},
  author    = {Lelarge, Marc and Miolane, Léo},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages     = {1297--1301},
  year      = {2017},
  editor    = {Kale, Satyen and Shamir, Ohad},
  volume    = {65},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v65/lelarge17a/lelarge17a.pdf},
  url       = {https://proceedings.mlr.press/v65/lelarge17a.html},
  abstract  = {We consider the high-dimensional inference problem where the signal is a low-rank symmetric matrix which is corrupted by an additive Gaussian noise. Given a probabilistic model for the low-rank matrix, we compute the limit in the large dimension setting for the mutual information between the signal and the observations, as well as the matrix minimum mean square error, while the rank of the signal remains constant. We unify and generalize a number of recent works on PCA, sparse PCA, submatrix localization or community detection by computing the information-theoretic limits for these problems in the high noise regime. This allows to locate precisely the information-theoretic thresholds for the above mentioned problems.}
}
Endnote
%0 Conference Paper
%T Fundamental limits of symmetric low-rank matrix estimation
%A Marc Lelarge
%A Léo Miolane
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-lelarge17a
%I PMLR
%P 1297--1301
%U https://proceedings.mlr.press/v65/lelarge17a.html
%V 65
%X We consider the high-dimensional inference problem where the signal is a low-rank symmetric matrix which is corrupted by an additive Gaussian noise. Given a probabilistic model for the low-rank matrix, we compute the limit in the large dimension setting for the mutual information between the signal and the observations, as well as the matrix minimum mean square error, while the rank of the signal remains constant. We unify and generalize a number of recent works on PCA, sparse PCA, submatrix localization or community detection by computing the information-theoretic limits for these problems in the high noise regime. This allows to locate precisely the information-theoretic thresholds for the above mentioned problems.
APA
Lelarge, M. & Miolane, L. (2017). Fundamental limits of symmetric low-rank matrix estimation. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:1297-1301. Available from https://proceedings.mlr.press/v65/lelarge17a.html.