A Lower Bound for Linear and Kernel Regression with Adaptive Covariates

Tor Lattimore
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:2095-2113, 2023.

Abstract

We prove that the continuous time version of the concentration bounds by Abbasi-Yadkori et al. (2011) for adaptive linear regression cannot be improved in general, showing that there can be a significant price for sequential design. This resolves the continuous time version of the COLT open problem by Vakili et al. (2021b) on confidence intervals for kernel regression with sequential designs. Experimental evidence suggests that improved confidence bounds are also not possible in discrete time.

Cite this Paper


BibTeX
@InProceedings{pmlr-v195-lattimore23b,
  title     = {A Lower Bound for Linear and Kernel Regression with Adaptive Covariates},
  author    = {Lattimore, Tor},
  booktitle = {Proceedings of Thirty Sixth Conference on Learning Theory},
  pages     = {2095--2113},
  year      = {2023},
  editor    = {Neu, Gergely and Rosasco, Lorenzo},
  volume    = {195},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--15 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v195/lattimore23b/lattimore23b.pdf},
  url       = {https://proceedings.mlr.press/v195/lattimore23b.html},
  abstract  = {We prove that the continuous time version of the concentration bounds by Abbasi-Yadkori et al. (2011) for adaptive linear regression cannot be improved in general, showing that there can be a significant price for sequential design. This resolves the continuous time version of the COLT open problem by Vakili et al. (2021b) on confidence intervals for kernel regression with sequential designs. Experimental evidence suggests that improved confidence bounds are also not possible in discrete time.}
}
Endnote
%0 Conference Paper
%T A Lower Bound for Linear and Kernel Regression with Adaptive Covariates
%A Tor Lattimore
%B Proceedings of Thirty Sixth Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2023
%E Gergely Neu
%E Lorenzo Rosasco
%F pmlr-v195-lattimore23b
%I PMLR
%P 2095--2113
%U https://proceedings.mlr.press/v195/lattimore23b.html
%V 195
%X We prove that the continuous time version of the concentration bounds by Abbasi-Yadkori et al. (2011) for adaptive linear regression cannot be improved in general, showing that there can be a significant price for sequential design. This resolves the continuous time version of the COLT open problem by Vakili et al. (2021b) on confidence intervals for kernel regression with sequential designs. Experimental evidence suggests that improved confidence bounds are also not possible in discrete time.
APA
Lattimore, T. (2023). A Lower Bound for Linear and Kernel Regression with Adaptive Covariates. Proceedings of Thirty Sixth Conference on Learning Theory, in Proceedings of Machine Learning Research 195:2095-2113. Available from https://proceedings.mlr.press/v195/lattimore23b.html.