Guaranteeing Safety of Learned Perception Modules via Measurement-Robust Control Barrier Functions

Sarah Dean, Andrew Taylor, Ryan Cosner, Benjamin Recht, Aaron Ames
Proceedings of the 2020 Conference on Robot Learning, PMLR 155:654-670, 2021.

Abstract

Modern nonlinear control theory seeks to develop feedback controllers that endow systems with properties such as safety and stability. The guarantees ensured by these controllers often rely on accurate estimates of the system state for determining control actions. In practice, measurement model uncertainty can lead to error in state estimates that degrades these guarantees. In this paper, we seek to unify techniques from control theory and machine learning to synthesize controllers that achieve safety in the presence of measurement model uncertainty. We define the notion of a Measurement-Robust Control Barrier Function (MR-CBF) as a tool for determining safe control inputs when facing measurement model uncertainty. Furthermore, MR-CBFs are used to inform sampling methodologies for learning-based perception systems and quantify tolerable error in the resulting learned models. We demonstrate the efficacy of MR-CBFs in achieving safety with measurement model uncertainty on a simulated Segway system.
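The core idea, robustifying a control barrier function condition against bounded state-estimate error, can be illustrated with a toy example. The sketch below is not the paper's full MR-CBF construction; it assumes a scalar single integrator, a linear barrier, and a known error bound `eps` on the state estimate, and shows how the worst-case estimate tightens the standard CBF safety filter.

```python
import math

# Toy sketch (illustrative only): scalar single integrator x' = u with safe
# set {x >= 0}, barrier h(x) = x, and a state estimate xhat satisfying
# |xhat - x| <= eps. Since x >= xhat - eps, enforcing the CBF condition
# against the worst-case estimate, u >= -alpha * (xhat - eps), implies the
# true condition h' >= -alpha * h(x), so the safe set stays invariant.

def mr_cbf_filter(u_des, xhat, eps, alpha=1.0):
    """Minimally modify a desired input so the robustified condition holds."""
    return max(u_des, -alpha * (xhat - eps))

def simulate(x0=1.0, eps=0.05, dt=0.01, steps=500):
    x, traj = x0, []
    for k in range(steps):
        xhat = x + eps * math.sin(3 * k)       # bounded measurement error
        u = mr_cbf_filter(-1.0, xhat, eps)     # desired input pushes toward x < 0
        x = x + dt * u                         # Euler step of x' = u
        traj.append(x)
    return traj

traj = simulate()
assert min(traj) > 0.0  # state never leaves the safe set
```

Note the conservatism: the filter keeps the state a margin of order `eps` away from the boundary, which is the price paid for tolerating measurement model uncertainty.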

Cite this Paper


BibTeX
@InProceedings{pmlr-v155-dean21a,
  title     = {Guaranteeing Safety of Learned Perception Modules via Measurement-Robust Control Barrier Functions},
  author    = {Dean, Sarah and Taylor, Andrew and Cosner, Ryan and Recht, Benjamin and Ames, Aaron},
  booktitle = {Proceedings of the 2020 Conference on Robot Learning},
  pages     = {654--670},
  year      = {2021},
  editor    = {Kober, Jens and Ramos, Fabio and Tomlin, Claire},
  volume    = {155},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Nov},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v155/dean21a/dean21a.pdf},
  url       = {https://proceedings.mlr.press/v155/dean21a.html},
  abstract  = {Modern nonlinear control theory seeks to develop feedback controllers that endow systems with properties such as safety and stability. The guarantees ensured by these controllers often rely on accurate estimates of the system state for determining control actions. In practice, measurement model uncertainty can lead to error in state estimates that degrades these guarantees. In this paper, we seek to unify techniques from control theory and machine learning to synthesize controllers that achieve safety in the presence of measurement model uncertainty. We define the notion of a Measurement-Robust Control Barrier Function (MR-CBF) as a tool for determining safe control inputs when facing measurement model uncertainty. Furthermore, MR-CBFs are used to inform sampling methodologies for learning-based perception systems and quantify tolerable error in the resulting learned models. We demonstrate the efficacy of MR-CBFs in achieving safety with measurement model uncertainty on a simulated Segway system.}
}
Endnote
%0 Conference Paper
%T Guaranteeing Safety of Learned Perception Modules via Measurement-Robust Control Barrier Functions
%A Sarah Dean
%A Andrew Taylor
%A Ryan Cosner
%A Benjamin Recht
%A Aaron Ames
%B Proceedings of the 2020 Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Jens Kober
%E Fabio Ramos
%E Claire Tomlin
%F pmlr-v155-dean21a
%I PMLR
%P 654--670
%U https://proceedings.mlr.press/v155/dean21a.html
%V 155
%X Modern nonlinear control theory seeks to develop feedback controllers that endow systems with properties such as safety and stability. The guarantees ensured by these controllers often rely on accurate estimates of the system state for determining control actions. In practice, measurement model uncertainty can lead to error in state estimates that degrades these guarantees. In this paper, we seek to unify techniques from control theory and machine learning to synthesize controllers that achieve safety in the presence of measurement model uncertainty. We define the notion of a Measurement-Robust Control Barrier Function (MR-CBF) as a tool for determining safe control inputs when facing measurement model uncertainty. Furthermore, MR-CBFs are used to inform sampling methodologies for learning-based perception systems and quantify tolerable error in the resulting learned models. We demonstrate the efficacy of MR-CBFs in achieving safety with measurement model uncertainty on a simulated Segway system.
APA
Dean, S., Taylor, A., Cosner, R., Recht, B. & Ames, A. (2021). Guaranteeing Safety of Learned Perception Modules via Measurement-Robust Control Barrier Functions. Proceedings of the 2020 Conference on Robot Learning, in Proceedings of Machine Learning Research 155:654-670. Available from https://proceedings.mlr.press/v155/dean21a.html.