Optimal denoising of rotationally invariant rectangular matrices

Emanuele Troiani, Vittorio Erba, Florent Krzakala, Antoine Maillard, Lenka Zdeborova
Proceedings of Mathematical and Scientific Machine Learning, PMLR 190:97-112, 2022.

Abstract

In this manuscript we consider denoising of large rectangular matrices: given a noisy observation of a signal matrix, what is the best way of recovering the signal matrix itself? For Gaussian noise and rotationally invariant signal priors, we completely characterize the optimal denoiser and its performance in the high-dimensional limit, in which the size of the signal matrix goes to infinity with fixed aspect ratio, and under the Bayes-optimal setting, that is, when the statistician knows how the signal and the observations were generated. Our results generalise previous works that considered only symmetric matrices to the more general case of non-symmetric and rectangular ones. We explore analytically and numerically a particular choice of factorized signal prior that models cross-covariance matrices and the matrix factorization problem. As a byproduct of our analysis, we provide an explicit asymptotic evaluation of the rectangular Harish-Chandra-Itzykson-Zuber integral in a special case.
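As a rough illustration of the setup described in the abstract (and not of the paper's Bayes-optimal denoiser), the following Python sketch generates a factorized rank-k signal S = F X^T / sqrt(k), observes it through additive Gaussian noise of variance Delta, and applies a naive entrywise Wiener-style shrinkage as a baseline. The dimensions, the rank k, the noise level Delta, and the shrinkage rule are illustrative assumptions, not quantities taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

n, m, k = 500, 300, 20   # signal dimensions and rank (illustrative values)
Delta = 0.5              # noise variance (illustrative value)

# Factorized signal prior: S = F X^T / sqrt(k), with i.i.d. Gaussian factors,
# modelling a cross-covariance / matrix-factorization type signal.
F = rng.standard_normal((n, k))
X = rng.standard_normal((m, k))
S = F @ X.T / np.sqrt(k)

# Noisy observation Y = S + sqrt(Delta) * Z, with Z having i.i.d. N(0, 1) entries.
Y = S + np.sqrt(Delta) * rng.standard_normal((n, m))

# Naive entrywise (Wiener) shrinkage baseline using the oracle per-entry
# signal variance; the paper's optimal denoiser instead acts on the
# singular values of the observation Y.
signal_var = np.mean(S**2)
S_hat = signal_var / (signal_var + Delta) * Y

mse = np.mean((S_hat - S) ** 2)
print(f"baseline per-entry MSE: {mse:.4f} (noise variance: {Delta})")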

Cite this Paper


BibTeX
@InProceedings{pmlr-v190-troiani22a,
  title     = {Optimal denoising of rotationally invariant rectangular matrices},
  author    = {Troiani, Emanuele and Erba, Vittorio and Krzakala, Florent and Maillard, Antoine and Zdeborova, Lenka},
  booktitle = {Proceedings of Mathematical and Scientific Machine Learning},
  pages     = {97--112},
  year      = {2022},
  editor    = {Dong, Bin and Li, Qianxiao and Wang, Lei and Xu, Zhi-Qin John},
  volume    = {190},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--17 Aug},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v190/troiani22a/troiani22a.pdf},
  url       = {https://proceedings.mlr.press/v190/troiani22a.html},
  abstract  = {In this manuscript we consider denoising of large rectangular matrices: given a noisy observation of a signal matrix, what is the best way of recovering the signal matrix itself? For Gaussian noise and rotationally invariant signal priors, we completely characterize the optimal denoiser and its performance in the high-dimensional limit, in which the size of the signal matrix goes to infinity with fixed aspect ratio, and under the Bayes-optimal setting, that is, when the statistician knows how the signal and the observations were generated. Our results generalise previous works that considered only symmetric matrices to the more general case of non-symmetric and rectangular ones. We explore analytically and numerically a particular choice of factorized signal prior that models cross-covariance matrices and the matrix factorization problem. As a byproduct of our analysis, we provide an explicit asymptotic evaluation of the rectangular Harish-Chandra-Itzykson-Zuber integral in a special case.}
}
Endnote
%0 Conference Paper
%T Optimal denoising of rotationally invariant rectangular matrices
%A Emanuele Troiani
%A Vittorio Erba
%A Florent Krzakala
%A Antoine Maillard
%A Lenka Zdeborova
%B Proceedings of Mathematical and Scientific Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Bin Dong
%E Qianxiao Li
%E Lei Wang
%E Zhi-Qin John Xu
%F pmlr-v190-troiani22a
%I PMLR
%P 97--112
%U https://proceedings.mlr.press/v190/troiani22a.html
%V 190
%X In this manuscript we consider denoising of large rectangular matrices: given a noisy observation of a signal matrix, what is the best way of recovering the signal matrix itself? For Gaussian noise and rotationally invariant signal priors, we completely characterize the optimal denoiser and its performance in the high-dimensional limit, in which the size of the signal matrix goes to infinity with fixed aspect ratio, and under the Bayes-optimal setting, that is, when the statistician knows how the signal and the observations were generated. Our results generalise previous works that considered only symmetric matrices to the more general case of non-symmetric and rectangular ones. We explore analytically and numerically a particular choice of factorized signal prior that models cross-covariance matrices and the matrix factorization problem. As a byproduct of our analysis, we provide an explicit asymptotic evaluation of the rectangular Harish-Chandra-Itzykson-Zuber integral in a special case.
APA
Troiani, E., Erba, V., Krzakala, F., Maillard, A. & Zdeborova, L. (2022). Optimal denoising of rotationally invariant rectangular matrices. Proceedings of Mathematical and Scientific Machine Learning, in Proceedings of Machine Learning Research 190:97-112. Available from https://proceedings.mlr.press/v190/troiani22a.html.