Single Trajectory Nonparametric Learning of Nonlinear Dynamics
Proceedings of the Thirty Fifth Conference on Learning Theory, PMLR 178:3333-3364, 2022.
Abstract
Given a single trajectory of a dynamical system, we analyze the performance of the nonparametric least squares estimator (LSE). More precisely, we give nonasymptotic expected ℓ2-distance bounds between the LSE and the true regression function, where the expectation is taken over a fresh, counterfactual, trajectory. We leverage recently developed information-theoretic methods to establish the optimality of the LSE for nonparametric hypothesis classes in terms of supremum norm metric entropy and a subgaussian parameter. Next, we relate this subgaussian parameter to the stability of the underlying process using notions from dynamical systems theory. When combined, these developments lead to rate-optimal error bounds that scale as T^{-1/(2+q)} for suitably stable processes and hypothesis classes with metric entropy growth of order δ^{-q}. Here, T is the length of the observed trajectory, δ ∈ R_+ is the packing granularity, and q ∈ (0,2) is a complexity term. Finally, we specialize our results to a number of scenarios of practical interest, such as Lipschitz dynamics, generalized linear models, and dynamics described by functions in certain classes of Reproducing Kernel Hilbert Spaces (RKHS).
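To make the rate concrete, here is a minimal sketch (not from the paper) that evaluates the error bound T^{-1/(2+q)} for a few values of the metric-entropy exponent q. The function name `lse_error_rate` and the specific q values are illustrative assumptions; the formula itself is the one stated in the abstract.

```python
# Illustrative sketch: order of the expected L2 error bound from the abstract,
# T^{-1/(2+q)}, as a function of trajectory length T and the exponent q
# governing metric entropy growth (entropy ~ delta^{-q}, with q in (0, 2)).
# `lse_error_rate` is a hypothetical helper, not an identifier from the paper.

def lse_error_rate(T: int, q: float) -> float:
    """Order of the expected L2 error for trajectory length T and
    metric-entropy exponent q (larger q = richer class = slower rate)."""
    return T ** (-1.0 / (2.0 + q))

if __name__ == "__main__":
    # Richer hypothesis classes (larger q) yield slower convergence.
    for q in (0.5, 1.0, 1.5):
        print(f"q = {q}: rate at T = 10^6 is {lse_error_rate(10**6, q):.4f}")
```

For example, at q = 1 the bound decays as T^{-1/3}, so a trajectory of length 10^6 gives an error of order 10^{-2}; pushing q toward 2 slows this toward the T^{-1/4} rate.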