Gradient-Informed Neural Network Statistical Robustness Estimation

Karim Tit, Teddy Furon, Mathias Rousset
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:323-334, 2023.

Abstract

Deep neural networks are robust against random corruptions of the inputs to some extent. This global sense of safety is not sufficient in critical applications where probabilities of failure must be assessed with accuracy. Some previous works applied known statistical methods from the field of rare event analysis to classification. Yet, they use classifiers as black-box models without taking into account gradient information, readily available for deep learning models via auto-differentiation. We propose a new and highly efficient estimator of probabilities of failure dedicated to neural networks as it leverages the fast computation of gradients of the model through back-propagation.
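To fix ideas, the sketch below illustrates the quantity at stake: the probability that a classifier's decision flips under random Gaussian corruption of a given input, estimated here with a plain crude Monte Carlo baseline, together with the input gradient of a margin score obtained by back-propagation, i.e. the cheap signal that a gradient-informed rare-event estimator can exploit. This is an illustrative sketch, not the estimator proposed in the paper; the toy network, the noise scale sigma and the sample budget are assumptions chosen for demonstration.

```python
# Illustrative sketch (not the paper's estimator): crude Monte Carlo estimation of
# the probability that a classifier fails under random Gaussian input corruption,
# plus the input gradient of a margin score via back-propagation.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical small classifier and nominal input (assumptions for illustration).
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))
x0 = torch.randn(20)
y0 = model(x0).argmax().item()   # treat the clean prediction as the reference label
sigma = 0.5                      # assumed corruption scale

def margin(x: torch.Tensor) -> torch.Tensor:
    """Score that is negative iff the (possibly batched) input is misclassified."""
    logits = model(x)
    true = logits[..., y0]
    other = logits.clone()
    other[..., y0] = -float("inf")
    return true - other.max(dim=-1).values

# Crude Monte Carlo baseline: P[failure] = P[ margin(x0 + sigma * eps) < 0 ].
n = 100_000
eps = torch.randn(n, 20)
with torch.no_grad():
    failures = (margin(x0 + sigma * eps) < 0).float()
print(f"crude MC estimate of failure probability: {failures.mean().item():.2e}")

# Gradient of the margin at a perturbed point, obtained by back-propagation.
# Gradient-informed samplers use this readily available direction to steer
# proposals toward the failure region instead of treating the model as a black box.
x = (x0 + sigma * torch.randn(20)).requires_grad_(True)
margin(x).backward()
print("input-gradient norm of the margin:", x.grad.norm().item())
```

Crude Monte Carlo needs on the order of 1/p samples to resolve a failure probability p, which is why rare-event techniques, here accelerated with gradients, are needed when failures are very unlikely.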

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-tit23a,
  title     = {Gradient-Informed Neural Network Statistical Robustness Estimation},
  author    = {Tit, Karim and Furon, Teddy and Rousset, Mathias},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {323--334},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/tit23a/tit23a.pdf},
  url       = {https://proceedings.mlr.press/v206/tit23a.html},
  abstract  = {Deep neural networks are robust against random corruptions of the inputs to some extent. This global sense of safety is not sufficient in critical applications where probabilities of failure must be assessed with accuracy. Some previous works applied known statistical methods from the field of rare event analysis to classification. Yet, they use classifiers as black-box models without taking into account gradient information, readily available for deep learning models via auto-differentiation. We propose a new and highly efficient estimator of probabilities of failure dedicated to neural networks as it leverages the fast computation of gradients of the model through back-propagation.}
}
Endnote
%0 Conference Paper
%T Gradient-Informed Neural Network Statistical Robustness Estimation
%A Karim Tit
%A Teddy Furon
%A Mathias Rousset
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-tit23a
%I PMLR
%P 323--334
%U https://proceedings.mlr.press/v206/tit23a.html
%V 206
%X Deep neural networks are robust against random corruptions of the inputs to some extent. This global sense of safety is not sufficient in critical applications where probabilities of failure must be assessed with accuracy. Some previous works applied known statistical methods from the field of rare event analysis to classification. Yet, they use classifiers as black-box models without taking into account gradient information, readily available for deep learning models via auto-differentiation. We propose a new and highly efficient estimator of probabilities of failure dedicated to neural networks as it leverages the fast computation of gradients of the model through back-propagation.
APA
Tit, K., Furon, T. & Rousset, M. (2023). Gradient-Informed Neural Network Statistical Robustness Estimation. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:323-334. Available from https://proceedings.mlr.press/v206/tit23a.html.