Finite Sample System Identification: Optimal Rates and the Role of Regularization

Yue Sun, Samet Oymak, Maryam Fazel
Proceedings of the 2nd Conference on Learning for Dynamics and Control, PMLR 120:16-25, 2020.

Abstract

This paper studies the optimality of regularized regression for low-order linear system identification. The nuclear norm of the system’s Hankel matrix is added as a regularizer to the least squares cost function because of the following advantages: (1) the regularization weight is easy to tune, (2) the sample complexity is lower, and (3) the estimated Hankel matrix has a clear singular value gap, which allows a low-order linear system to be robustly recovered from noisy output observations. Recently, the performance of unregularized least squares formulations has been studied statistically in terms of finite sample complexity and recovery error; however, no such results are known for the regularized approach. In this work, we show that, while retaining the sample complexity advantage, the regularized algorithm outperforms unregularized least squares in terms of the spectral norm bound on the Hankel matrix error.
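To make the formulation concrete, the sketch below shows one way to set up the nuclear-norm-regularized least squares problem described in the abstract: fit the system's Markov (impulse response) parameters to input-output data, with a nuclear norm penalty on their Hankel matrix. This is not the authors' code; the single-input single-output setup, the FIR horizon T, the function name fit_hankel_regularized, and the weight lam are illustrative assumptions, and the convex program is solved with cvxpy.

```python
import numpy as np
import cvxpy as cp

def fit_hankel_regularized(u, y, T, lam):
    """Nuclear-norm-regularized least squares over a Hankel matrix (illustrative sketch)."""
    N = len(y)
    h = cp.Variable(T)  # first T Markov (impulse response) parameters to estimate
    # Linear map from Markov parameters to predicted outputs: y[t] ~ sum_k h[k] * u[t - k]
    U = np.zeros((N, T))
    for t in range(N):
        for k in range(min(T, t + 1)):
            U[t, k] = u[t - k]
    # Hankel matrix of the Markov parameters, H[i, j] = h[i + j]
    m = (T + 1) // 2
    ncols = T - m + 1
    H = cp.vstack([h[i:i + ncols] for i in range(m)])
    # Least squares fit plus nuclear norm penalty on the Hankel matrix
    objective = cp.sum_squares(y - U @ h) + lam * cp.norm(H, "nuc")
    cp.Problem(cp.Minimize(objective)).solve()
    return h.value
```

For example, `h_hat = fit_hankel_regularized(u, y, T=20, lam=1.0)` returns the estimated Markov parameters; a low-order state-space realization can then be extracted from the singular value decomposition of their Hankel matrix, exploiting the singular value gap the regularizer promotes.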

Cite this Paper


BibTeX
@InProceedings{pmlr-v120-sun20a,
  title = {Finite Sample System Identification: Optimal Rates and the Role of Regularization},
  author = {Sun, Yue and Oymak, Samet and Fazel, Maryam},
  booktitle = {Proceedings of the 2nd Conference on Learning for Dynamics and Control},
  pages = {16--25},
  year = {2020},
  editor = {Bayen, Alexandre M. and Jadbabaie, Ali and Pappas, George and Parrilo, Pablo A. and Recht, Benjamin and Tomlin, Claire and Zeilinger, Melanie},
  volume = {120},
  series = {Proceedings of Machine Learning Research},
  month = {10--11 Jun},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v120/sun20a/sun20a.pdf},
  url = {https://proceedings.mlr.press/v120/sun20a.html},
  abstract = {This paper studies the optimality of regularized regression for low-order linear system identification. The nuclear norm of the system’s Hankel matrix is added as a regularizer to the least squares cost function because of the following advantages: (1) the regularization weight is easy to tune, (2) the sample complexity is lower, and (3) the estimated Hankel matrix has a clear singular value gap, which allows a low-order linear system to be robustly recovered from noisy output observations. Recently, the performance of unregularized least squares formulations has been studied statistically in terms of finite sample complexity and recovery error; however, no such results are known for the regularized approach. In this work, we show that, while retaining the sample complexity advantage, the regularized algorithm outperforms unregularized least squares in terms of the spectral norm bound on the Hankel matrix error.}
}
Endnote
%0 Conference Paper
%T Finite Sample System Identification: Optimal Rates and the Role of Regularization
%A Yue Sun
%A Samet Oymak
%A Maryam Fazel
%B Proceedings of the 2nd Conference on Learning for Dynamics and Control
%C Proceedings of Machine Learning Research
%D 2020
%E Alexandre M. Bayen
%E Ali Jadbabaie
%E George Pappas
%E Pablo A. Parrilo
%E Benjamin Recht
%E Claire Tomlin
%E Melanie Zeilinger
%F pmlr-v120-sun20a
%I PMLR
%P 16--25
%U https://proceedings.mlr.press/v120/sun20a.html
%V 120
%X This paper studies the optimality of regularized regression for low-order linear system identification. The nuclear norm of the system’s Hankel matrix is added as a regularizer to the least squares cost function because of the following advantages: (1) the regularization weight is easy to tune, (2) the sample complexity is lower, and (3) the estimated Hankel matrix has a clear singular value gap, which allows a low-order linear system to be robustly recovered from noisy output observations. Recently, the performance of unregularized least squares formulations has been studied statistically in terms of finite sample complexity and recovery error; however, no such results are known for the regularized approach. In this work, we show that, while retaining the sample complexity advantage, the regularized algorithm outperforms unregularized least squares in terms of the spectral norm bound on the Hankel matrix error.
APA
Sun, Y., Oymak, S. & Fazel, M. (2020). Finite Sample System Identification: Optimal Rates and the Role of Regularization. Proceedings of the 2nd Conference on Learning for Dynamics and Control, in Proceedings of Machine Learning Research 120:16-25. Available from https://proceedings.mlr.press/v120/sun20a.html.
