Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing

Ramji Venkataramanan, Kevin Kögler, Marco Mondelli
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:22120-22144, 2022.

Abstract

We consider the problem of signal estimation in generalized linear models defined via rotationally invariant design matrices. Since these matrices can have an arbitrary spectral distribution, this model is well suited for capturing complex correlation structures which often arise in applications. We propose a novel family of approximate message passing (AMP) algorithms for signal estimation, and rigorously characterize their performance in the high-dimensional limit via a state evolution recursion. Our rotationally invariant AMP has complexity of the same order as the existing AMP derived under the restrictive assumption of a Gaussian design; our algorithm also recovers this existing AMP as a special case. Numerical results showcase a performance close to Vector AMP (which is conjectured to be Bayes-optimal in some settings), but obtained with a much lower complexity, as the proposed algorithm does not require a computationally expensive singular value decomposition.
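The paper states that the existing AMP for Gaussian designs is recovered as a special case of the proposed algorithm. As a purely illustrative sketch of that special case (not the paper's rotationally invariant algorithm), the textbook AMP iteration for a sparse linear model `y = A x + noise` with an i.i.d. Gaussian design alternates a matched-filter step, a soft-threshold denoiser, and an Onsager-corrected residual update. The function names `soft_threshold` and `amp_gaussian` and the threshold scale `alpha` are hypothetical choices for this sketch:

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding denoiser."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def amp_gaussian(A, y, n_iter=30, alpha=2.0):
    """Textbook AMP for y = A x + noise with an i.i.d. Gaussian design A.

    `alpha` scales the adaptive threshold; it is an illustrative tuning
    knob for this sketch, not a parameter from the paper.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()  # corrected residual
    for _ in range(n_iter):
        r = x + A.T @ z                           # effective observation fed to the denoiser
        tau = np.linalg.norm(z) / np.sqrt(m)      # empirical estimate of the effective noise level
        x_new = soft_threshold(r, alpha * tau)
        # Onsager correction: (1/m) * sum of denoiser derivatives,
        # which for soft thresholding is the support size over m.
        b = np.count_nonzero(x_new) / m
        z = y - A @ x_new + b * z
        x = x_new
    return x
```

Note the Onsager term `b * z` in the residual update: it is what distinguishes AMP from plain iterative thresholding and what makes the denoiser input behave like the signal plus Gaussian noise, the property that state evolution tracks. The paper's contribution is the analogous (memory-corrected) iteration and its state evolution for arbitrary rotationally invariant designs.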

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-venkataramanan22a,
  title     = {Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing},
  author    = {Venkataramanan, Ramji and K{\"o}gler, Kevin and Mondelli, Marco},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {22120--22144},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/venkataramanan22a/venkataramanan22a.pdf},
  url       = {https://proceedings.mlr.press/v162/venkataramanan22a.html},
  abstract  = {We consider the problem of signal estimation in generalized linear models defined via rotationally invariant design matrices. Since these matrices can have an arbitrary spectral distribution, this model is well suited for capturing complex correlation structures which often arise in applications. We propose a novel family of approximate message passing (AMP) algorithms for signal estimation, and rigorously characterize their performance in the high-dimensional limit via a state evolution recursion. Our rotationally invariant AMP has complexity of the same order as the existing AMP derived under the restrictive assumption of a Gaussian design; our algorithm also recovers this existing AMP as a special case. Numerical results showcase a performance close to Vector AMP (which is conjectured to be Bayes-optimal in some settings), but obtained with a much lower complexity, as the proposed algorithm does not require a computationally expensive singular value decomposition.}
}
Endnote
%0 Conference Paper
%T Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing
%A Ramji Venkataramanan
%A Kevin Kögler
%A Marco Mondelli
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-venkataramanan22a
%I PMLR
%P 22120--22144
%U https://proceedings.mlr.press/v162/venkataramanan22a.html
%V 162
%X We consider the problem of signal estimation in generalized linear models defined via rotationally invariant design matrices. Since these matrices can have an arbitrary spectral distribution, this model is well suited for capturing complex correlation structures which often arise in applications. We propose a novel family of approximate message passing (AMP) algorithms for signal estimation, and rigorously characterize their performance in the high-dimensional limit via a state evolution recursion. Our rotationally invariant AMP has complexity of the same order as the existing AMP derived under the restrictive assumption of a Gaussian design; our algorithm also recovers this existing AMP as a special case. Numerical results showcase a performance close to Vector AMP (which is conjectured to be Bayes-optimal in some settings), but obtained with a much lower complexity, as the proposed algorithm does not require a computationally expensive singular value decomposition.
APA
Venkataramanan, R., Kögler, K. & Mondelli, M. (2022). Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:22120-22144. Available from https://proceedings.mlr.press/v162/venkataramanan22a.html.