Fundamental Limits of Distributed Covariance Matrix Estimation Under Communication Constraints

Mohammad Reza Rahmani, Mohammad Hossein Yassaee, Mohammad Ali Maddah-Ali, Mohammad Reza Aref
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:41927-41958, 2024.

Abstract

Estimating high-dimensional covariance matrices is crucial in various domains. This work considers a scenario where two collaborating agents access disjoint dimensions of $m$ samples from a high-dimensional random vector, and they can only communicate a limited number of bits to a central server, which wants to accurately approximate the covariance matrix. We analyze the fundamental trade-off between communication cost, number of samples, and estimation accuracy. We prove a lower bound on the error achievable by any estimator, highlighting the impact of dimensions, number of samples, and communication budget. Furthermore, we present an algorithm that achieves this lower bound up to a logarithmic factor, demonstrating its near-optimality in practical settings.
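The setting described above can be illustrated with a minimal numerical sketch. This is not the paper's estimator or its rate-optimal scheme; it is a toy baseline, assuming a simple uniform scalar quantizer as the communication mechanism, showing two agents that each see a disjoint coordinate block of the same $m$ samples and send quantized values to a server, which then estimates the cross-covariance block from the quantized data.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, m, b = 4, 3, 10_000, 8          # block dims, sample count, bits per entry

# Ground-truth covariance of the full (d1 + d2)-dimensional vector.
A = rng.standard_normal((d1 + d2, d1 + d2))
cov = A @ A.T / (d1 + d2)
X = rng.multivariate_normal(np.zeros(d1 + d2), cov, size=m)
X1, X2 = X[:, :d1], X[:, d1:]           # agent 1's view / agent 2's view

def quantize(Z, bits, lim=6.0):
    """Uniform scalar quantizer on [-lim, lim] with 2**bits levels
    (a stand-in for the communication constraint, not the paper's code)."""
    levels = 2 ** bits
    step = 2 * lim / levels
    idx = np.clip(np.floor((Z + lim) / step), 0, levels - 1)
    return -lim + (idx + 0.5) * step    # reconstruct at bin centers

# Each agent sends b bits per observed entry, m * d_i * b bits in total.
Q1, Q2 = quantize(X1, b), quantize(X2, b)

# The server only needs communication to learn the off-diagonal block
# Sigma_12 (each agent can estimate its own diagonal block locally).
est = Q1.T @ Q2 / m
err = np.linalg.norm(est - cov[:d1, d1:], ord=2)
```

With $b$ large the residual error is dominated by the $O(\sqrt{d/m})$ sampling error of the plug-in estimator; shrinking $b$ trades communication for added quantization distortion, which is exactly the trade-off the paper's bounds characterize.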

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-rahmani24a,
  title     = {Fundamental Limits of Distributed Covariance Matrix Estimation Under Communication Constraints},
  author    = {Rahmani, Mohammad Reza and Yassaee, Mohammad Hossein and Maddah-Ali, Mohammad Ali and Aref, Mohammad Reza},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {41927--41958},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/rahmani24a/rahmani24a.pdf},
  url       = {https://proceedings.mlr.press/v235/rahmani24a.html},
  abstract  = {Estimating high-dimensional covariance matrices is crucial in various domains. This work considers a scenario where two collaborating agents access disjoint dimensions of $m$ samples from a high-dimensional random vector, and they can only communicate a limited number of bits to a central server, which wants to accurately approximate the covariance matrix. We analyze the fundamental trade-off between communication cost, number of samples, and estimation accuracy. We prove a lower bound on the error achievable by any estimator, highlighting the impact of dimensions, number of samples, and communication budget. Furthermore, we present an algorithm that achieves this lower bound up to a logarithmic factor, demonstrating its near-optimality in practical settings.}
}
Endnote
%0 Conference Paper
%T Fundamental Limits of Distributed Covariance Matrix Estimation Under Communication Constraints
%A Mohammad Reza Rahmani
%A Mohammad Hossein Yassaee
%A Mohammad Ali Maddah-Ali
%A Mohammad Reza Aref
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-rahmani24a
%I PMLR
%P 41927--41958
%U https://proceedings.mlr.press/v235/rahmani24a.html
%V 235
%X Estimating high-dimensional covariance matrices is crucial in various domains. This work considers a scenario where two collaborating agents access disjoint dimensions of $m$ samples from a high-dimensional random vector, and they can only communicate a limited number of bits to a central server, which wants to accurately approximate the covariance matrix. We analyze the fundamental trade-off between communication cost, number of samples, and estimation accuracy. We prove a lower bound on the error achievable by any estimator, highlighting the impact of dimensions, number of samples, and communication budget. Furthermore, we present an algorithm that achieves this lower bound up to a logarithmic factor, demonstrating its near-optimality in practical settings.
APA
Rahmani, M.R., Yassaee, M.H., Maddah-Ali, M.A. & Aref, M.R. (2024). Fundamental Limits of Distributed Covariance Matrix Estimation Under Communication Constraints. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:41927-41958. Available from https://proceedings.mlr.press/v235/rahmani24a.html.