An information-theoretic lower bound in time-uniform estimation
Proceedings of the Thirty-Seventh Conference on Learning Theory, PMLR 247:1486-1500, 2024.
Abstract
We present an information-theoretic lower bound for the problem of parameter estimation with time-uniform coverage guarantees. Via a reduction to sequential testing, we obtain lower bounds stronger than their fixed-sample counterparts, capturing the intrinsic hardness of the time-uniform setting. For location model estimation and logistic regression, our lower bound is $\Omega(\sqrt{n^{-1}\log \log n})$, which is tight up to constant factors in typical settings.