BNN-DP: Robustness Certification of Bayesian Neural Networks via Dynamic Programming

Steven Adams, Andrea Patane, Morteza Lahijanian, Luca Laurenti
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:133-151, 2023.

Abstract

In this paper, we introduce BNN-DP, an efficient algorithmic framework for the analysis of adversarial robustness of Bayesian Neural Networks (BNNs). Given a compact set of input points $T\subset \mathbb{R}^n$, BNN-DP computes lower and upper bounds on the BNN’s predictions for all the points in $T$. The framework is based on an interpretation of BNNs as stochastic dynamical systems, which enables the use of Dynamic Programming (DP) algorithms to bound the prediction range along the layers of the network. Specifically, the method uses bound propagation techniques and convex relaxations to derive a backward recursion procedure to over-approximate the prediction range of the BNN with piecewise affine functions. The algorithm is general and can handle both regression and classification tasks. In a set of experiments on various regression and classification tasks and BNN architectures, we show that BNN-DP outperforms state-of-the-art methods by up to four orders of magnitude in both tightness of the bounds and computational efficiency.
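
To make the bounding task concrete, here is a minimal, hypothetical Python sketch of the general principle of computing sound lower and upper bounds on a network's output over an input box, with each layer's weights confined to posterior credible intervals (mean ± k standard deviations). This is only an illustration of naive interval bound propagation under a Gaussian-posterior assumption of ours; it is not the paper's BNN-DP algorithm, which instead runs a backward dynamic-programming recursion with piecewise affine relaxations to obtain far tighter bounds. All function names below are invented for the example.

import numpy as np

def interval_matvec(Wl, Wu, xl, xu):
    # Elementwise interval product [Wl, Wu] * [xl, xu]: the extremes are
    # attained at one of the four corner products, so take min/max over
    # them and sum over the input dimension.
    cands = np.stack([Wl * xl, Wl * xu, Wu * xl, Wu * xu])
    return cands.min(axis=0).sum(axis=1), cands.max(axis=0).sum(axis=1)

def bnn_output_bounds(layers, xl, xu, k=2.0):
    # layers: list of (W_mean, W_std, b_mean, b_std) per layer; weights are
    # assumed (our simplification) to lie in the box mean +/- k * std.
    for i, (Wm, Ws, bm, bs) in enumerate(layers):
        Wl, Wu = Wm - k * Ws, Wm + k * Ws
        yl, yu = interval_matvec(Wl, Wu, xl, xu)
        xl, xu = yl + (bm - k * bs), yu + (bm + k * bs)
        if i < len(layers) - 1:  # ReLU on hidden layers (monotone, so safe on bounds)
            xl, xu = np.maximum(xl, 0.0), np.maximum(xu, 0.0)
    return xl, xu  # valid for all inputs in the box and all weights in the credible box

# Usage: a toy 2-16-1 Bayesian regressor, bounding predictions over a small input box.
rng = np.random.default_rng(0)
layers = [(rng.normal(size=(16, 2)), 0.1 * np.ones((16, 2)), rng.normal(size=16), 0.1 * np.ones(16)),
          (rng.normal(size=(1, 16)), 0.1 * np.ones((1, 16)), rng.normal(size=1), 0.1 * np.ones(1))]
lo, hi = bnn_output_bounds(layers, np.array([-0.1, -0.1]), np.array([0.1, 0.1]))

Plain interval propagation of this kind loses tightness quickly with depth; that looseness is precisely what BNN-DP's backward recursion with piecewise affine over-approximations is designed to avoid.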

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-adams23a,
  title     = {{BNN}-{DP}: Robustness Certification of {B}ayesian Neural Networks via Dynamic Programming},
  author    = {Adams, Steven and Patane, Andrea and Lahijanian, Morteza and Laurenti, Luca},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {133--151},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/adams23a/adams23a.pdf},
  url       = {https://proceedings.mlr.press/v202/adams23a.html}
}
Endnote
%0 Conference Paper
%T BNN-DP: Robustness Certification of Bayesian Neural Networks via Dynamic Programming
%A Steven Adams
%A Andrea Patane
%A Morteza Lahijanian
%A Luca Laurenti
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-adams23a
%I PMLR
%P 133--151
%U https://proceedings.mlr.press/v202/adams23a.html
%V 202
APA
Adams, S., Patane, A., Lahijanian, M., & Laurenti, L. (2023). BNN-DP: Robustness Certification of Bayesian Neural Networks via Dynamic Programming. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:133-151. Available from https://proceedings.mlr.press/v202/adams23a.html.
