The Sample Complexity of Distributed Simple Binary Hypothesis Testing under Information Constraints

Hadi Kazemi, Ankit Pensia, Varun Jog
Proceedings of Thirty Eighth Conference on Learning Theory, PMLR 291:3213-3214, 2025.

Abstract

This paper resolves two open problems from a recent paper (Pensia et al., 2024b) concerning the sample complexity of distributed simple binary hypothesis testing under information constraints. The first open problem asks whether interaction reduces the sample complexity of distributed simple binary hypothesis testing. In this paper, we show that sequential interaction does not help. The second problem suggests tightening existing sample complexity bounds for communication-constrained simple binary hypothesis testing. We derive optimally tight bounds for this setting and resolve this problem. Our main technical contributions are: (i) a one-shot lower bound on the Bayes error in simple binary hypothesis testing that satisfies a crucial tensorisation property; (ii) a streamlined proof of the formula for the sample complexity of simple binary hypothesis testing without constraints, first established in (Pensia et al., 2024b); and (iii) a reverse data-processing inequality for Hellinger-$\lambda$ divergences, generalising the results from Bhatt et al. (2021) and Pensia et al. (2023).

Cite this Paper


BibTeX
@InProceedings{pmlr-v291-kazemi25a,
  title     = {The Sample Complexity of Distributed Simple Binary Hypothesis Testing under Information Constraints},
  author    = {Kazemi, Hadi and Pensia, Ankit and Jog, Varun},
  booktitle = {Proceedings of Thirty Eighth Conference on Learning Theory},
  pages     = {3213--3214},
  year      = {2025},
  editor    = {Haghtalab, Nika and Moitra, Ankur},
  volume    = {291},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--04 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v291/main/assets/kazemi25a/kazemi25a.pdf},
  url       = {https://proceedings.mlr.press/v291/kazemi25a.html},
  abstract  = {This paper resolves two open problems from a recent paper (Pensia et al., 2024b) concerning the sample complexity of distributed simple binary hypothesis testing under information constraints. The first open problem asks whether interaction reduces the sample complexity of distributed simple binary hypothesis testing. In this paper, we show that sequential interaction does not help. The second problem suggests tightening existing sample complexity bounds for communication-constrained simple binary hypothesis testing. We derive optimally tight bounds for this setting and resolve this problem. Our main technical contributions are: (i) a one-shot lower bound on the Bayes error in simple binary hypothesis testing that satisfies a crucial tensorisation property; (ii) a streamlined proof of the formula for the sample complexity of simple binary hypothesis testing without constraints, first established in (Pensia et al., 2024b); and (iii) a reverse data-processing inequality for Hellinger-$\lambda$ divergences, generalising the results from Bhatt et al. (2021) and Pensia et al. (2023).}
}
Endnote
%0 Conference Paper
%T The Sample Complexity of Distributed Simple Binary Hypothesis Testing under Information Constraints
%A Hadi Kazemi
%A Ankit Pensia
%A Varun Jog
%B Proceedings of Thirty Eighth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Nika Haghtalab
%E Ankur Moitra
%F pmlr-v291-kazemi25a
%I PMLR
%P 3213--3214
%U https://proceedings.mlr.press/v291/kazemi25a.html
%V 291
%X This paper resolves two open problems from a recent paper (Pensia et al., 2024b) concerning the sample complexity of distributed simple binary hypothesis testing under information constraints. The first open problem asks whether interaction reduces the sample complexity of distributed simple binary hypothesis testing. In this paper, we show that sequential interaction does not help. The second problem suggests tightening existing sample complexity bounds for communication-constrained simple binary hypothesis testing. We derive optimally tight bounds for this setting and resolve this problem. Our main technical contributions are: (i) a one-shot lower bound on the Bayes error in simple binary hypothesis testing that satisfies a crucial tensorisation property; (ii) a streamlined proof of the formula for the sample complexity of simple binary hypothesis testing without constraints, first established in (Pensia et al., 2024b); and (iii) a reverse data-processing inequality for Hellinger-$\lambda$ divergences, generalising the results from Bhatt et al. (2021) and Pensia et al. (2023).
APA
Kazemi, H., Pensia, A. & Jog, V. (2025). The Sample Complexity of Distributed Simple Binary Hypothesis Testing under Information Constraints. Proceedings of Thirty Eighth Conference on Learning Theory, in Proceedings of Machine Learning Research 291:3213-3214. Available from https://proceedings.mlr.press/v291/kazemi25a.html.