Recovery Guarantees for Distributed-OMP

Chen Amiraz, Robert Krauthgamer, Boaz Nadler
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:802-810, 2024.

Abstract

We study distributed schemes for high-dimensional sparse linear regression, based on orthogonal matching pursuit (OMP). Such schemes are particularly suited for settings where a central fusion center is connected to end machines that have both computational and communication limitations. We prove that under suitable assumptions, distributed-OMP schemes recover the support of the regression vector with communication per machine linear in its sparsity and logarithmic in the dimension. Remarkably, this holds even at low signal-to-noise ratios, where individual machines are unable to detect the support. Our simulations show that distributed-OMP schemes are competitive with more computationally intensive methods, and in some cases even outperform them.
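To make the setup concrete, below is a minimal Python sketch of one plausible distributed-OMP variant: each machine runs standard OMP on its local data and communicates only its estimated support (k indices, i.e. on the order of k log p bits) to the fusion center, which aggregates by voting. The function names (omp, distributed_omp) and the voting-based aggregation are illustrative assumptions, not a reproduction of the exact schemes analyzed in the paper.

import numpy as np

def omp(X, y, k):
    # Standard orthogonal matching pursuit: greedily select k columns of X.
    residual = y.copy()
    support = []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        correlations = np.abs(X.T @ residual)
        correlations[support] = -np.inf  # exclude already-selected columns
        support.append(int(np.argmax(correlations)))
        # Re-fit on the selected columns and update the residual.
        coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ coef
    return support

def distributed_omp(local_data, k):
    # Fusion center: collect each machine's local support estimate and vote.
    p = local_data[0][0].shape[1]
    votes = np.zeros(p)
    for X_m, y_m in local_data:          # one (X, y) pair per machine
        for j in omp(X_m, y_m, k):       # each machine sends only k indices
            votes[j] += 1
    return np.argsort(votes)[::-1][:k]   # k indices with the most votes

# Usage (synthetic example): a p-dimensional k-sparse signal observed
# noisily on M machines, each holding its own design matrix and responses.
rng = np.random.default_rng(0)
p, k, M, n_per_machine = 200, 5, 20, 50
beta = np.zeros(p)
beta[rng.choice(p, k, replace=False)] = 1.0
data = []
for _ in range(M):
    X = rng.standard_normal((n_per_machine, p))
    y = X @ beta + 0.5 * rng.standard_normal(n_per_machine)
    data.append((X, y))
print(sorted(distributed_omp(data, k)), sorted(np.flatnonzero(beta)))

Even when a single machine's OMP run misses part of the support because of noise, aggregating the local estimates at the fusion center can still identify the correct indices, which is the intuition behind the low-SNR regime discussed in the abstract.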

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-amiraz24a,
  title     = {Recovery Guarantees for Distributed-{OMP}},
  author    = {Amiraz, Chen and Krauthgamer, Robert and Nadler, Boaz},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {802--810},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/amiraz24a/amiraz24a.pdf},
  url       = {https://proceedings.mlr.press/v238/amiraz24a.html},
  abstract  = {We study distributed schemes for high-dimensional sparse linear regression, based on orthogonal matching pursuit (OMP). Such schemes are particularly suited for settings where a central fusion center is connected to end machines that have both computational and communication limitations. We prove that under suitable assumptions, distributed-OMP schemes recover the support of the regression vector with communication per machine linear in its sparsity and logarithmic in the dimension. Remarkably, this holds even at low signal-to-noise ratios, where individual machines are unable to detect the support. Our simulations show that distributed-OMP schemes are competitive with more computationally intensive methods, and in some cases even outperform them.}
}
Endnote
%0 Conference Paper
%T Recovery Guarantees for Distributed-OMP
%A Chen Amiraz
%A Robert Krauthgamer
%A Boaz Nadler
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-amiraz24a
%I PMLR
%P 802--810
%U https://proceedings.mlr.press/v238/amiraz24a.html
%V 238
%X We study distributed schemes for high-dimensional sparse linear regression, based on orthogonal matching pursuit (OMP). Such schemes are particularly suited for settings where a central fusion center is connected to end machines that have both computational and communication limitations. We prove that under suitable assumptions, distributed-OMP schemes recover the support of the regression vector with communication per machine linear in its sparsity and logarithmic in the dimension. Remarkably, this holds even at low signal-to-noise ratios, where individual machines are unable to detect the support. Our simulations show that distributed-OMP schemes are competitive with more computationally intensive methods, and in some cases even outperform them.
APA
Amiraz, C., Krauthgamer, R. & Nadler, B. (2024). Recovery Guarantees for Distributed-OMP. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:802-810. Available from https://proceedings.mlr.press/v238/amiraz24a.html.