A Lower Bound for Linear and Kernel Regression with Adaptive Covariates
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:2095-2113, 2023.
Abstract
We prove that the continuous-time version of the concentration bounds by Abbasi-Yadkori et al. (2011) for adaptive linear regression cannot be improved in general, showing that there can be a significant price for sequential design. This resolves the continuous-time version of the COLT open problem by Vakili et al. (2021b) on confidence intervals for kernel regression with sequential designs. Experimental evidence suggests that improved confidence bounds are also not possible in discrete time.
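For context, a standard statement of the self-normalized concentration bound of Abbasi-Yadkori et al. (2011) that the abstract refers to is sketched below; the notation ($X_t$, $\eta_t$, $R$, $V$) is ours and is not taken from the abstract.

$$
\begin{aligned}
&\text{Let } \{\eta_t\}_{t\ge 1} \text{ be conditionally } R\text{-sub-Gaussian noise and } \{X_t\}_{t\ge 1} \text{ a predictable covariate sequence.}\\
&\text{With } S_t = \sum_{s=1}^{t} \eta_s X_s \quad\text{and}\quad \bar V_t = V + \sum_{s=1}^{t} X_s X_s^{\top} \ \ (V \succ 0),\\
&\text{with probability at least } 1-\delta,\ \text{for all } t \ge 0:\qquad
\|S_t\|_{\bar V_t^{-1}}^{2} \;\le\; 2R^{2}\,\log\!\left(\frac{\det(\bar V_t)^{1/2}\,\det(V)^{-1/2}}{\delta}\right).
\end{aligned}
$$

The present paper shows that, in continuous time, this dependence on the (data-dependent) determinant ratio cannot be improved in general when the covariates are chosen adaptively.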