Learning Linear Models Using Distributed Iterative Hessian Sketching

Han Wang, James Anderson
Proceedings of The 4th Annual Learning for Dynamics and Control Conference, PMLR 168:427-440, 2022.

Abstract

This work considers the problem of learning the Markov parameters of a linear system from observed data. Recent non-asymptotic system identification results have characterized the sample complexity of this problem in the single and multi-rollout setting. In both instances, the number of samples required in order to obtain acceptable estimates can produce optimization problems with an intractably large number of decision variables for a second-order algorithm. We show that a randomized and distributed Newton algorithm based on Hessian-sketching can produce $\epsilon$-optimal solutions and converges geometrically. Moreover, the algorithm is trivially parallelizable. Our results hold for a variety of sketching matrices and we illustrate the theory with numerical examples.
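The paper's algorithm is distributed and tailored to system identification, but the core iterative Hessian sketching idea it builds on can be illustrated on an ordinary least-squares problem. The following is a minimal single-machine sketch (assuming a Gaussian sketching matrix; the function name and parameters are illustrative, not the authors' implementation): each iteration forms an approximate Hessian from a small random sketch of the data and takes a Newton-like step using the exact gradient.

```python
import numpy as np

def ihs_least_squares(A, b, sketch_dim, iters=20, rng=None):
    """Iterative Hessian sketch for min_x ||Ax - b||^2.

    Instead of solving the full normal equations A^T A x = A^T b,
    each iteration draws a fresh Gaussian sketch S (sketch_dim x n),
    approximates the Hessian by (SA)^T (SA), and solves a small
    d x d system. Converges geometrically when sketch_dim is a
    modest multiple of d.
    """
    rng = np.random.default_rng(rng)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(iters):
        S = rng.standard_normal((sketch_dim, n)) / np.sqrt(sketch_dim)
        SA = S @ A                     # sketched data: sketch_dim x d
        H = SA.T @ SA                  # sketched Hessian approximation
        g = A.T @ (b - A @ x)          # exact (negative) gradient
        x = x + np.linalg.solve(H, g)  # sketched Newton step
    return x
```

Because only the Hessian is sketched while the gradient stays exact, the iterates contract toward the true least-squares solution at a geometric rate; in a distributed variant the sketch-and-solve step is the part that parallelizes across workers.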

Cite this Paper


BibTeX
@InProceedings{pmlr-v168-wang22b,
  title     = {Learning Linear Models Using Distributed Iterative Hessian Sketching},
  author    = {Wang, Han and Anderson, James},
  booktitle = {Proceedings of The 4th Annual Learning for Dynamics and Control Conference},
  pages     = {427--440},
  year      = {2022},
  editor    = {Firoozi, Roya and Mehr, Negar and Yel, Esen and Antonova, Rika and Bohg, Jeannette and Schwager, Mac and Kochenderfer, Mykel},
  volume    = {168},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--24 Jun},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v168/wang22b/wang22b.pdf},
  url       = {https://proceedings.mlr.press/v168/wang22b.html},
  abstract  = {This work considers the problem of learning the Markov parameters of a linear system from observed data. Recent non-asymptotic system identification results have characterized the sample complexity of this problem in the single and multi-rollout setting. In both instances, the number of samples required in order to obtain acceptable estimates can produce optimization problems with an intractably large number of decision variables for a second-order algorithm. We show that a randomized and distributed Newton algorithm based on Hessian-sketching can produce $\epsilon$-optimal solutions and converges geometrically. Moreover, the algorithm is trivially parallelizable. Our results hold for a variety of sketching matrices and we illustrate the theory with numerical examples.}
}
Endnote
%0 Conference Paper
%T Learning Linear Models Using Distributed Iterative Hessian Sketching
%A Han Wang
%A James Anderson
%B Proceedings of The 4th Annual Learning for Dynamics and Control Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Roya Firoozi
%E Negar Mehr
%E Esen Yel
%E Rika Antonova
%E Jeannette Bohg
%E Mac Schwager
%E Mykel Kochenderfer
%F pmlr-v168-wang22b
%I PMLR
%P 427--440
%U https://proceedings.mlr.press/v168/wang22b.html
%V 168
%X This work considers the problem of learning the Markov parameters of a linear system from observed data. Recent non-asymptotic system identification results have characterized the sample complexity of this problem in the single and multi-rollout setting. In both instances, the number of samples required in order to obtain acceptable estimates can produce optimization problems with an intractably large number of decision variables for a second-order algorithm. We show that a randomized and distributed Newton algorithm based on Hessian-sketching can produce $\epsilon$-optimal solutions and converges geometrically. Moreover, the algorithm is trivially parallelizable. Our results hold for a variety of sketching matrices and we illustrate the theory with numerical examples.
APA
Wang, H. & Anderson, J. (2022). Learning Linear Models Using Distributed Iterative Hessian Sketching. Proceedings of The 4th Annual Learning for Dynamics and Control Conference, in Proceedings of Machine Learning Research 168:427-440. Available from https://proceedings.mlr.press/v168/wang22b.html.