Training Lipschitz Continuous Operators Using Reproducing Kernels

Henk van Waarde, Rodolphe Sepulchre
Proceedings of The 4th Annual Learning for Dynamics and Control Conference, PMLR 168:221-233, 2022.

Abstract

This paper proposes that Lipschitz continuity is a natural outcome of regularized least squares in kernel-based learning. Lipschitz continuity is an important proxy for robustness of input-output operators. It is also instrumental for guaranteeing closed-loop stability of kernel-based controllers through small incremental gain arguments. We introduce a new class of nonexpansive kernels that are shown to induce Hilbert spaces consisting of only Lipschitz continuous operators. The Lipschitz constant of estimated operators within such Hilbert spaces can be tuned by suitable selection of a regularization parameter. As is typical for kernel-based models, input-output operators are estimated from data by solving tractable systems of linear equations. The approach thus constitutes a promising alternative to Lipschitz-bounded neural networks, which have recently been investigated but are computationally expensive to train.
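
To make the estimation step concrete, below is a minimal Python sketch of kernel-based regularized least squares, i.e., minimizing sum_i (y_i - f(x_i))^2 + lam * ||f||_H^2 over an RKHS H, which the representer theorem reduces to a linear system. The nonexpansive kernel class introduced in the paper is not reproduced here; a standard Gaussian kernel stands in as an illustrative placeholder, and lam plays the role of the regularization parameter that, for the paper's kernels, controls the Lipschitz constant of the estimate.

import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Pairwise Gaussian kernel matrix: K[i, j] = exp(-||x_i - z_j||^2 / (2 sigma^2)).
    # Placeholder kernel; the paper's nonexpansive kernels are not reproduced here.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def fit(X, y, lam=1e-2, sigma=1.0):
    # Representer theorem: the regularized least-squares estimate has the form
    # f(x) = sum_i alpha_i k(x, x_i), with alpha solving the linear system
    # (K + lam * n * I) alpha = y -- the "tractable system of linear equations".
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, alpha, X_new, sigma=1.0):
    # Evaluate the kernel expansion at new inputs.
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Toy usage: estimate a scalar input-output map from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(50, 1))
y = np.tanh(X[:, 0]) + 0.05 * rng.standard_normal(50)
alpha = fit(X, y, lam=1e-2)
y_hat = predict(X, alpha, np.array([[0.5]]))

Increasing lam shrinks the RKHS norm of the estimate, which for the kernel class in the paper translates into a smaller Lipschitz bound.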

Cite this Paper

BibTeX
@InProceedings{pmlr-v168-waarde22a,
  title = {Training Lipschitz Continuous Operators Using Reproducing Kernels},
  author = {Waarde, Henk van and Sepulchre, Rodolphe},
  booktitle = {Proceedings of The 4th Annual Learning for Dynamics and Control Conference},
  pages = {221--233},
  year = {2022},
  editor = {Firoozi, Roya and Mehr, Negar and Yel, Esen and Antonova, Rika and Bohg, Jeannette and Schwager, Mac and Kochenderfer, Mykel},
  volume = {168},
  series = {Proceedings of Machine Learning Research},
  month = {23--24 Jun},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v168/waarde22a/waarde22a.pdf},
  url = {https://proceedings.mlr.press/v168/waarde22a.html},
  abstract = {This paper proposes that Lipschitz continuity is a natural outcome of regularized least squares in kernel-based learning. Lipschitz continuity is an important proxy for robustness of input-output operators. It is also instrumental for guaranteeing closed-loop stability of kernel-based controllers through small incremental gain arguments. We introduce a new class of nonexpansive kernels that are shown to induce Hilbert spaces consisting of only Lipschitz continuous operators. The Lipschitz constant of estimated operators within such Hilbert spaces can be tuned by suitable selection of a regularization parameter. As is typical for kernel-based models, input-output operators are estimated from data by solving tractable systems of linear equations. The approach thus constitutes a promising alternative to Lipschitz-bounded neural networks, which have recently been investigated but are computationally expensive to train.}
}
Endnote
%0 Conference Paper
%T Training Lipschitz Continuous Operators Using Reproducing Kernels
%A Henk van Waarde
%A Rodolphe Sepulchre
%B Proceedings of The 4th Annual Learning for Dynamics and Control Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Roya Firoozi
%E Negar Mehr
%E Esen Yel
%E Rika Antonova
%E Jeannette Bohg
%E Mac Schwager
%E Mykel Kochenderfer
%F pmlr-v168-waarde22a
%I PMLR
%P 221--233
%U https://proceedings.mlr.press/v168/waarde22a.html
%V 168
%X This paper proposes that Lipschitz continuity is a natural outcome of regularized least squares in kernel-based learning. Lipschitz continuity is an important proxy for robustness of input-output operators. It is also instrumental for guaranteeing closed-loop stability of kernel-based controllers through small incremental gain arguments. We introduce a new class of nonexpansive kernels that are shown to induce Hilbert spaces consisting of only Lipschitz continuous operators. The Lipschitz constant of estimated operators within such Hilbert spaces can be tuned by suitable selection of a regularization parameter. As is typical for kernel-based models, input-output operators are estimated from data by solving tractable systems of linear equations. The approach thus constitutes a promising alternative to Lipschitz-bounded neural networks, which have recently been investigated but are computationally expensive to train.
APA
Waarde, H.v. & Sepulchre, R. (2022). Training Lipschitz Continuous Operators Using Reproducing Kernels. Proceedings of The 4th Annual Learning for Dynamics and Control Conference, in Proceedings of Machine Learning Research 168:221-233. Available from https://proceedings.mlr.press/v168/waarde22a.html.
