Noisy Computing of the Threshold Function
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:1313-1315, 2025.
Abstract
Let $\mathsf{TH}_k$ denote the $k$-out-of-$n$ threshold function: given $n$ input Boolean variables, the output is $1$ if and only if at least $k$ of the inputs are $1$. We consider the problem of computing the $\mathsf{TH}_k$ function using noisy readings of the Boolean variables, where each reading is incorrect with some fixed and known probability $p\in(0,1/2)$. As our main result, we show that it is sufficient to use $(1+o(1))\frac{n\log\frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation to compute the $\mathsf{TH}_k$ function with a vanishing error probability $\delta = o(1)$, where $m\triangleq \min\{k,n-k+1\}$ and $D_{\mathsf{KL}}(p \| 1-p)$ denotes the Kullback-Leibler divergence between the $\mathsf{Bern}(p)$ and $\mathsf{Bern}(1-p)$ distributions. Conversely, we show that any algorithm achieving an error probability of $\delta = o(1)$ requires at least $(1-o(1))\frac{(n-m)\log\frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation. The upper and lower bounds are tight when $m=o(n)$, and are within a multiplicative factor of $\frac{n}{n-m}$ of each other when $m=\Theta(n)$. In particular, when $k=n/2$, the $\mathsf{TH}_k$ function corresponds to the $\mathsf{MAJORITY}$ function, in which case the upper and lower bounds are tight up to a multiplicative factor of two. Compared to previous work, our result tightens the dependence on $p$ in both the upper and lower bounds.
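For intuition on the scale of these bounds, the divergence term admits a simple closed form (a standard identity for Bernoulli distributions, stated here only as a worked equation):

$$D_{\mathsf{KL}}(p \,\|\, 1-p) = p\log\frac{p}{1-p} + (1-p)\log\frac{1-p}{p} = (1-2p)\log\frac{1-p}{p}.$$

For example, at $p=1/4$ this evaluates to $\frac{1}{2}\log 3 \approx 0.549$ nats (taking natural logarithms throughout; the ratio in the bounds is base-independent), so for the $\mathsf{MAJORITY}$ function ($m = n/2$) the sufficient query count is roughly $1.82\, n\log\frac{n}{2\delta}$ in expectation.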