Noisy Computing of the Threshold Function

Ziao Wang, Nadim Ghaddar, Banghua Zhu, Lele Wang
Proceedings of The 36th International Conference on Algorithmic Learning Theory, PMLR 272:1313-1315, 2025.

Abstract

Let $\mathsf{TH}_k$ denote the $k$-out-of-$n$ threshold function: given $n$ input Boolean variables, the output is $1$ if and only if at least $k$ of the inputs are $1$. We consider the problem of computing the $\mathsf{TH}_k$ function using noisy readings of the Boolean variables, where each reading is incorrect with some fixed and known probability $p \in (0,1/2)$. As our main result, we show that it is sufficient to use $(1+o(1)) \frac{n\log \frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation to compute the $\mathsf{TH}_k$ function with a vanishing error probability $\delta = o(1)$, where $m \triangleq \min\{k,n-k+1\}$ and $D_{\mathsf{KL}}(p \| 1-p)$ denotes the Kullback-Leibler divergence between $\mathsf{Bern}(p)$ and $\mathsf{Bern}(1-p)$ distributions. Conversely, we show that any algorithm achieving an error probability of $\delta = o(1)$ necessitates at least $(1-o(1))\frac{(n-m)\log\frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation. The upper and lower bounds are tight when $m=o(n)$, and are within a multiplicative factor of $\frac{n}{n-m}$ when $m=\Theta(n)$. In particular, when $k=n/2$, the $\mathsf{TH}_k$ function corresponds to the $\mathsf{MAJORITY}$ function, in which case the upper and lower bounds are tight up to a multiplicative factor of two. Compared to previous work, our result tightens the dependence on $p$ in both the upper and lower bounds.
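
For a quick numerical sense of these bounds, the following Python sketch (illustrative only, not part of the paper; the names kl_bern and threshold_query_bounds are ours) evaluates the leading-order expressions with the o(1) terms dropped.

import math

def kl_bern(p, q):
    # D_KL(Bern(p) || Bern(q)); the log base only needs to match the one used for log(m/delta)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def threshold_query_bounds(n, k, p, delta):
    # Leading-order bounds from the abstract, with the o(1) factors dropped
    m = min(k, n - k + 1)
    d = kl_bern(p, 1 - p)
    upper = n * math.log(m / delta) / d        # (1+o(1)) n log(m/delta) / D_KL(p || 1-p)
    lower = (n - m) * math.log(m / delta) / d  # (1-o(1)) (n-m) log(m/delta) / D_KL(p || 1-p)
    return upper, lower

# MAJORITY example: k = n/2 gives m = n/2, so the two bounds are a factor n/(n-m) = 2 apart
print(threshold_query_bounds(1000, 500, p=0.1, delta=0.01))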

Cite this Paper


BibTeX
@InProceedings{pmlr-v272-wang25a,
  title     = {Noisy Computing of the Threshold Function},
  author    = {Wang, Ziao and Ghaddar, Nadim and Zhu, Banghua and Wang, Lele},
  booktitle = {Proceedings of The 36th International Conference on Algorithmic Learning Theory},
  pages     = {1313--1315},
  year      = {2025},
  editor    = {Kamath, Gautam and Loh, Po-Ling},
  volume    = {272},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--27 Feb},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v272/main/assets/wang25a/wang25a.pdf},
  url       = {https://proceedings.mlr.press/v272/wang25a.html},
  abstract  = {Let $\mathsf{TH}_k$ denote the $k$-out-of-$n$ threshold function: given $n$ input Boolean variables, the output is $1$ if and only if at least $k$ of the inputs are $1$. We consider the problem of computing the $\mathsf{TH}_k$ function using noisy readings of the Boolean variables, where each reading is incorrect with some fixed and known probability $p \in (0,1/2)$. As our main result, we show that it is sufficient to use $(1+o(1)) \frac{n\log \frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation to compute the $\mathsf{TH}_k$ function with a vanishing error probability $\delta = o(1)$, where $m\triangleq \min\{k,n-k+1\}$ and $D_{\mathsf{KL}}(p \| 1-p)$ denotes the Kullback-Leibler divergence between $\mathsf{Bern}(p)$ and $\mathsf{Bern}(1-p)$ distributions. Conversely, we show that any algorithm achieving an error probability of $\delta = o(1)$ necessitates at least $(1-o(1))\frac{(n-m)\log\frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation. The upper and lower bounds are tight when $m=o(n)$, and are within a multiplicative factor of $\frac{n}{n-m}$ when $m=\Theta(n)$. In particular, when $k=n/2$, the $\mathsf{TH}_k$ function corresponds to the $\mathsf{MAJORITY}$ function, in which case the upper and lower bounds are tight up to a multiplicative factor of two. Compared to previous work, our result tightens the dependence on $p$ in both the upper and lower bounds.}
}
Endnote
%0 Conference Paper
%T Noisy Computing of the Threshold Function
%A Ziao Wang
%A Nadim Ghaddar
%A Banghua Zhu
%A Lele Wang
%B Proceedings of The 36th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2025
%E Gautam Kamath
%E Po-Ling Loh
%F pmlr-v272-wang25a
%I PMLR
%P 1313--1315
%U https://proceedings.mlr.press/v272/wang25a.html
%V 272
%X Let $\mathsf{TH}_k$ denote the $k$-out-of-$n$ threshold function: given $n$ input Boolean variables, the output is $1$ if and only if at least $k$ of the inputs are $1$. We consider the problem of computing the $\mathsf{TH}_k$ function using noisy readings of the Boolean variables, where each reading is incorrect with some fixed and known probability $p \in (0,1/2)$. As our main result, we show that it is sufficient to use $(1+o(1)) \frac{n\log \frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation to compute the $\mathsf{TH}_k$ function with a vanishing error probability $\delta = o(1)$, where $m\triangleq \min\{k,n-k+1\}$ and $D_{\mathsf{KL}}(p \| 1-p)$ denotes the Kullback-Leibler divergence between $\mathsf{Bern}(p)$ and $\mathsf{Bern}(1-p)$ distributions. Conversely, we show that any algorithm achieving an error probability of $\delta = o(1)$ necessitates at least $(1-o(1))\frac{(n-m)\log\frac{m}{\delta}}{D_{\mathsf{KL}}(p \| 1-p)}$ queries in expectation. The upper and lower bounds are tight when $m=o(n)$, and are within a multiplicative factor of $\frac{n}{n-m}$ when $m=\Theta(n)$. In particular, when $k=n/2$, the $\mathsf{TH}_k$ function corresponds to the $\mathsf{MAJORITY}$ function, in which case the upper and lower bounds are tight up to a multiplicative factor of two. Compared to previous work, our result tightens the dependence on $p$ in both the upper and lower bounds.
APA
Wang, Z., Ghaddar, N., Zhu, B. & Wang, L. (2025). Noisy Computing of the Threshold Function. Proceedings of The 36th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 272:1313-1315. Available from https://proceedings.mlr.press/v272/wang25a.html.