Learning Without Mixing: Towards A Sharp Analysis of Linear System Identification

Max Simchowitz, Horia Mania, Stephen Tu, Michael I. Jordan, Benjamin Recht
Proceedings of the 31st Conference On Learning Theory, PMLR 75:439-473, 2018.

Abstract

We prove that the ordinary least-squares (OLS) estimator attains nearly minimax optimal performance for the identification of linear dynamical systems from a single observed trajectory. Our upper bound relies on a generalization of Mendelson’s small-ball method to dependent data, eschewing the use of standard mixing-time arguments. Our lower bounds reveal that these upper bounds match up to logarithmic factors. In particular, we capture the correct signal-to-noise behavior of the problem, showing that \emph{more unstable} linear systems are \emph{easier} to estimate. This behavior is qualitatively different from arguments which rely on mixing-time calculations that suggest that unstable systems are more difficult to estimate. We generalize our technique to provide bounds for a more general class of linear response time-series.
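The setting the abstract describes can be illustrated with a minimal sketch (not from the paper): simulate a single trajectory of a linear system $x_{t+1} = A x_t + w_t$ with a hypothetical stable matrix $A$, then recover $A$ by ordinary least squares over the observed $(x_t, x_{t+1})$ pairs.

```python
import numpy as np

# Minimal sketch (illustrative only, not the paper's experiment):
# identify A in x_{t+1} = A x_t + w_t from a single observed trajectory.
rng = np.random.default_rng(0)
d, T = 3, 500
A = np.array([[0.9, 0.1, 0.0],
              [0.0, 0.8, 0.2],
              [0.0, 0.0, 0.7]])  # hypothetical stable system matrix

# Roll out one trajectory with Gaussian process noise.
x = np.zeros((T + 1, d))
for t in range(T):
    x[t + 1] = A @ x[t] + rng.normal(scale=0.1, size=d)

# OLS: solve min_B ||X B - X_next||_F over the stacked pairs; A_hat = B^T.
X, X_next = x[:-1], x[1:]
A_hat = np.linalg.lstsq(X, X_next, rcond=None)[0].T

print(np.linalg.norm(A_hat - A))  # operator-norm-style error, shrinks with T
```

The paper's contribution is the finite-sample analysis of exactly this estimator from dependent data, without mixing-time arguments; the noise scale, dimension, and trajectory length above are arbitrary choices for the sketch.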

Cite this Paper


BibTeX
@InProceedings{pmlr-v75-simchowitz18a,
  title     = {Learning Without Mixing: Towards A Sharp Analysis of Linear System Identification},
  author    = {Simchowitz, Max and Mania, Horia and Tu, Stephen and Jordan, Michael I. and Recht, Benjamin},
  booktitle = {Proceedings of the 31st Conference On Learning Theory},
  pages     = {439--473},
  year      = {2018},
  editor    = {Bubeck, Sébastien and Perchet, Vianney and Rigollet, Philippe},
  volume    = {75},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--09 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v75/simchowitz18a/simchowitz18a.pdf},
  url       = {https://proceedings.mlr.press/v75/simchowitz18a.html},
  abstract  = {We prove that the ordinary least-squares (OLS) estimator attains nearly minimax optimal performance for the identification of linear dynamical systems from a single observed trajectory. Our upper bound relies on a generalization of Mendelson's small-ball method to dependent data, eschewing the use of standard mixing-time arguments. Our lower bounds reveal that these upper bounds match up to logarithmic factors. In particular, we capture the correct signal-to-noise behavior of the problem, showing that \emph{more unstable} linear systems are \emph{easier} to estimate. This behavior is qualitatively different from arguments which rely on mixing-time calculations that suggest that unstable systems are more difficult to estimate. We generalize our technique to provide bounds for a more general class of linear response time-series.}
}
Endnote
%0 Conference Paper
%T Learning Without Mixing: Towards A Sharp Analysis of Linear System Identification
%A Max Simchowitz
%A Horia Mania
%A Stephen Tu
%A Michael I. Jordan
%A Benjamin Recht
%B Proceedings of the 31st Conference On Learning Theory
%C Proceedings of Machine Learning Research
%D 2018
%E Sébastien Bubeck
%E Vianney Perchet
%E Philippe Rigollet
%F pmlr-v75-simchowitz18a
%I PMLR
%P 439--473
%U https://proceedings.mlr.press/v75/simchowitz18a.html
%V 75
%X We prove that the ordinary least-squares (OLS) estimator attains nearly minimax optimal performance for the identification of linear dynamical systems from a single observed trajectory. Our upper bound relies on a generalization of Mendelson's small-ball method to dependent data, eschewing the use of standard mixing-time arguments. Our lower bounds reveal that these upper bounds match up to logarithmic factors. In particular, we capture the correct signal-to-noise behavior of the problem, showing that \emph{more unstable} linear systems are \emph{easier} to estimate. This behavior is qualitatively different from arguments which rely on mixing-time calculations that suggest that unstable systems are more difficult to estimate. We generalize our technique to provide bounds for a more general class of linear response time-series.
APA
Simchowitz, M., Mania, H., Tu, S., Jordan, M. I. & Recht, B. (2018). Learning Without Mixing: Towards A Sharp Analysis of Linear System Identification. Proceedings of the 31st Conference On Learning Theory, in Proceedings of Machine Learning Research 75:439-473. Available from https://proceedings.mlr.press/v75/simchowitz18a.html.