An information-theoretic lower bound in time-uniform estimation

John Duchi, Saminul Haque
Proceedings of Thirty Seventh Conference on Learning Theory, PMLR 247:1486-1500, 2024.

Abstract

We present an information-theoretic lower bound for the problem of parameter estimation with time-uniform coverage guarantees. We use a reduction to sequential testing to obtain stronger lower bounds that capture the hardness of the time-uniform setting. In the case of location model estimation and logistic regression, our lower bound is Ω(√(n⁻¹ log log n)), which is tight up to constant factors in typical settings.

Cite this Paper


BibTeX
@InProceedings{pmlr-v247-duchi24a,
  title     = {An information-theoretic lower bound in time-uniform estimation},
  author    = {Duchi, John and Haque, Saminul},
  booktitle = {Proceedings of Thirty Seventh Conference on Learning Theory},
  pages     = {1486--1500},
  year      = {2024},
  editor    = {Agrawal, Shipra and Roth, Aaron},
  volume    = {247},
  series    = {Proceedings of Machine Learning Research},
  month     = {30 Jun--03 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v247/duchi24a/duchi24a.pdf},
  url       = {https://proceedings.mlr.press/v247/duchi24a.html},
  abstract  = {We present an information-theoretic lower bound for the problem of parameter estimation with time-uniform coverage guarantees. We use a reduction to sequential testing to obtain stronger lower bounds that capture the hardness of the time-uniform setting. In the case of location model estimation and logistic regression, our lower bound is $\Omega(\sqrt{n^{-1}\log \log n})$, which is tight up to constant factors in typical settings.}
}
Endnote
%0 Conference Paper
%T An information-theoretic lower bound in time-uniform estimation
%A John Duchi
%A Saminul Haque
%B Proceedings of Thirty Seventh Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2024
%E Shipra Agrawal
%E Aaron Roth
%F pmlr-v247-duchi24a
%I PMLR
%P 1486--1500
%U https://proceedings.mlr.press/v247/duchi24a.html
%V 247
%X We present an information-theoretic lower bound for the problem of parameter estimation with time-uniform coverage guarantees. We use a reduction to sequential testing to obtain stronger lower bounds that capture the hardness of the time-uniform setting. In the case of location model estimation and logistic regression, our lower bound is $\Omega(\sqrt{n^{-1}\log \log n})$, which is tight up to constant factors in typical settings.
APA
Duchi, J. & Haque, S. (2024). An information-theoretic lower bound in time-uniform estimation. Proceedings of Thirty Seventh Conference on Learning Theory, in Proceedings of Machine Learning Research 247:1486-1500. Available from https://proceedings.mlr.press/v247/duchi24a.html.