Finite Sample System Identification: Optimal Rates and the Role of Regularization
Proceedings of the 2nd Conference on Learning for Dynamics and Control, PMLR 120:16-25, 2020.
Abstract
This paper studies the optimality of regularized regression for low-order linear system identification. The nuclear norm of the system's Hankel matrix is added as a regularizer to the least squares cost function because of the following advantages: (1) an easy-to-tune regularization weight, (2) lower sample complexity, and (3) a recovered Hankel matrix with a clear singular value gap, which allows robust recovery of a low-order linear system from noisy output observations. Recently, the performance of unregularized least squares formulations has been studied statistically in terms of finite sample complexity and recovery error; however, no such results are known for the regularized approach. In this work, we show that, while keeping its sample complexity advantage, the regularized algorithm outperforms unregularized least squares in terms of the Hankel spectral norm error bound.
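To make the formulation concrete, the following is a minimal sketch (not the authors' code) of the regularized regression described above: fit noisy impulse-response samples by least squares while penalizing the nuclear norm of their Hankel matrix. The solver (CVXPY with its default conic solver), the synthetic data, and the weight `lam` are illustrative assumptions, not values from the paper.

```python
import numpy as np
import cvxpy as cp

def hankel_nuclear_fit(y, lam, n_rows):
    """Solve  min_h ||h - y||_2^2 + lam * ||Hankel(h)||_*  (sketch)."""
    T = len(y)
    n_cols = T - n_rows + 1
    h = cp.Variable(T)                      # estimated impulse response
    H = cp.Variable((n_rows, n_cols))       # its Hankel matrix, tied to h below
    constraints = [H[i, j] == h[i + j]
                   for i in range(n_rows) for j in range(n_cols)]
    objective = cp.Minimize(cp.sum_squares(h - y) + lam * cp.normNuc(H))
    cp.Problem(objective, constraints).solve()
    return h.value, H.value

# Hypothetical example: impulse response of a low-order (2-state) system in noise.
rng = np.random.default_rng(0)
t = np.arange(40)
y_true = 0.8 ** t * np.cos(0.5 * t)
y_noisy = y_true + 0.05 * rng.standard_normal(t.size)
h_hat, H_hat = hankel_nuclear_fit(y_noisy, lam=0.1, n_rows=20)
# Leading singular values of the recovered Hankel matrix; a clear gap after the
# first two indicates the recovered system order.
print(np.round(np.linalg.svd(H_hat, compute_uv=False)[:5], 3))
```

In this sketch the regularization weight `lam` trades off data fit against the low-order (low Hankel rank) prior; the abstract's claim is that this weight is easy to tune relative to hard rank constraints.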