Simple Binary Hypothesis Testing under Local Differential Privacy and Communication Constraints

Ankit Pensia, Amir Reza Asadi, Varun Jog, Po-Ling Loh
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:3229-3230, 2023.

Abstract

We study simple binary hypothesis testing under local differential privacy (LDP) and communication constraints. Our results are either minimax optimal or instance optimal: the former hold for the set of distribution pairs with prescribed Hellinger divergence and total variation distance, whereas the latter hold for specific distribution pairs. For the sample complexity of simple hypothesis testing under pure LDP constraints, we establish instance-optimal bounds for distributions with binary support; minimax-optimal bounds for general distributions; and (approximately) instance-optimal, computationally efficient algorithms for general distributions. Under both privacy and communication constraints, we develop instance-optimal, computationally efficient algorithms that achieve minimal sample complexity (up to universal constants). Our results on instance-optimal algorithms hinge on identifying the extreme points of the joint range set of two distributions $p$ and $q$, defined as $\mathcal{A} := \{(\mathbf{T} p, \mathbf{T} q) | \mathbf{T} \in \mathcal{C}\}$, where $\mathcal{C}$ is the set of channels characterizing the constraints.
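The definition of the joint range set $\mathcal{A}$ can be made concrete with a small numerical sketch (illustrative only, not taken from the paper; the distributions, the privacy parameter, and the variable names below are hypothetical). The channel T below is the standard binary randomized-response mechanism, which satisfies $\varepsilon$-LDP; applying it to a pair $(p, q)$ yields one point $(\mathbf{T} p, \mathbf{T} q)$ of $\mathcal{A}$, and the contraction of the Hellinger divergence under T is, roughly speaking, the quantity that controls the constrained sample complexity.

import numpy as np

# Illustrative sketch (not from the paper): compute one point of the joint
# range set A = {(Tp, Tq) : T in C}, where C is the set of eps-LDP channels.
eps = 1.0                    # hypothetical privacy parameter
p = np.array([0.9, 0.1])     # hypothetical distribution p on a binary alphabet
q = np.array([0.4, 0.6])     # hypothetical distribution q on a binary alphabet

# Binary randomized response: keep the input with probability e^eps / (1 + e^eps),
# flip it otherwise.  Column x of T is the output distribution given input x, and
# every likelihood ratio T[y, x] / T[y, x'] is at most e^eps, so T is eps-LDP.
keep = np.exp(eps) / (1.0 + np.exp(eps))
T = np.array([[keep, 1.0 - keep],
              [1.0 - keep, keep]])

Tp, Tq = T @ p, T @ q        # one element (Tp, Tq) of the joint range set A
print(Tp, Tq)

# Squared Hellinger divergence before and after the channel; by data processing
# it can only shrink, and the best achievable post-channel divergence is what
# governs the sample complexity of the constrained test.
def hellinger_sq(a, b):
    return 0.5 * np.sum((np.sqrt(a) - np.sqrt(b)) ** 2)

print(hellinger_sq(p, q), hellinger_sq(Tp, Tq))

Maximizing the post-channel Hellinger divergence over all of $\mathcal{C}$, which amounts to searching over the extreme points of $\mathcal{A}$, is the optimization underlying the instance-optimal algorithms described in the abstract.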

Cite this Paper


BibTeX
@InProceedings{pmlr-v195-pensia23a,
  title     = {Simple Binary Hypothesis Testing under Local Differential Privacy and Communication Constraints},
  author    = {Pensia, Ankit and Asadi, Amir Reza and Jog, Varun and Loh, Po-Ling},
  booktitle = {Proceedings of Thirty Sixth Conference on Learning Theory},
  pages     = {3229--3230},
  year      = {2023},
  editor    = {Neu, Gergely and Rosasco, Lorenzo},
  volume    = {195},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--15 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v195/pensia23a/pensia23a.pdf},
  url       = {https://proceedings.mlr.press/v195/pensia23a.html}
}
Endnote
%0 Conference Paper
%T Simple Binary Hypothesis Testing under Local Differential Privacy and Communication Constraints
%A Ankit Pensia
%A Amir Reza Asadi
%A Varun Jog
%A Po-Ling Loh
%B Proceedings of Thirty Sixth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Gergely Neu
%E Lorenzo Rosasco
%F pmlr-v195-pensia23a
%I PMLR
%P 3229--3230
%U https://proceedings.mlr.press/v195/pensia23a.html
%V 195
APA
Pensia, A., Asadi, A. R., Jog, V., & Loh, P.-L. (2023). Simple Binary Hypothesis Testing under Local Differential Privacy and Communication Constraints. Proceedings of Thirty Sixth Conference on Learning Theory, in Proceedings of Machine Learning Research 195:3229-3230. Available from https://proceedings.mlr.press/v195/pensia23a.html.
