Simple Binary Hypothesis Testing under Local Differential Privacy and Communication Constraints
Proceedings of the Thirty-Sixth Conference on Learning Theory, PMLR 195:3229-3230, 2023.
Abstract
We study simple binary hypothesis testing under local differential privacy (LDP) and communication constraints. Our results are either minimax optimal or instance optimal: the former hold for the set of distribution pairs with prescribed Hellinger divergence and total variation distance, whereas the latter hold for specific distribution pairs. For the sample complexity of simple hypothesis testing under pure LDP constraints, we establish instance-optimal bounds for distributions with binary support; minimax-optimal bounds for general distributions; and (approximately) instance-optimal, computationally efficient algorithms for general distributions. Under both privacy and communication constraints, we develop instance-optimal, computationally efficient algorithms that achieve minimal sample complexity (up to universal constants). Our results on instance-optimal algorithms hinge on identifying the extreme points of the joint range set of two distributions $p$ and $q$, defined as $\mathcal{A} := \{(\mathbf{T} p, \mathbf{T} q) \mid \mathbf{T} \in \mathcal{C}\}$, where $\mathcal{C}$ is the set of channels characterizing the constraints.
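To make the joint range set $\mathcal{A}$ concrete, here is a minimal Python sketch (an illustration in the paper's notation, not code from the paper): a channel $\mathbf{T}$ is represented as a column-stochastic matrix, a pure $\varepsilon$-LDP channel bounds every output's likelihood ratio across inputs by $e^{\varepsilon}$, and each admissible $\mathbf{T}$ contributes one point $(\mathbf{T}p, \mathbf{T}q)$ to $\mathcal{A}$. The helper names (`randomized_response`, `hellinger_sq`, `is_pure_ldp`) are hypothetical, introduced here only for illustration.

```python
import numpy as np

def hellinger_sq(a, b):
    # Squared Hellinger distance: H^2(a, b) = 1 - sum_y sqrt(a_y * b_y).
    return 1.0 - float(np.sum(np.sqrt(a * b)))

def is_pure_ldp(T, eps):
    # Pure eps-LDP: for every output y, max_x T[y, x] <= e^eps * min_x T[y, x].
    # Columns of T are the channel's output distributions, one per input symbol.
    return bool(np.all(T.max(axis=1) <= np.exp(eps) * T.min(axis=1) + 1e-12))

def randomized_response(k, eps):
    # k-ary randomized response, a canonical pure eps-LDP channel:
    # report the true symbol w.p. e^eps / (e^eps + k - 1), else a uniform other symbol.
    T = np.full((k, k), 1.0 / (np.exp(eps) + k - 1))
    np.fill_diagonal(T, np.exp(eps) / (np.exp(eps) + k - 1))
    return T

# One point (Tp, Tq) of the joint range set A = {(Tp, Tq) : T in C}.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])
eps = 1.0
T = randomized_response(len(p), eps)
assert is_pure_ldp(T, eps)
Tp, Tq = T @ p, T @ q
print(f"H^2(p, q)   = {hellinger_sq(p, q):.4f}")
print(f"H^2(Tp, Tq) = {hellinger_sq(Tp, Tq):.4f}")  # shrinks, by data processing
```

By the data-processing inequality, $H^2(\mathbf{T}p, \mathbf{T}q) \le H^2(p, q)$, and the sample complexity of testing $p$ versus $q$ from the channel outputs scales inversely with the squared Hellinger distance after the channel; this is why characterizing the extreme points of $\mathcal{A}$, over which this divergence is maximized, drives the instance-optimal algorithms above.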