Probabilistic Verification of ReLU Neural Networks via Characteristic Functions

Joshua Pilipovsky, Vignesh Sivaramakrishnan, Meeko Oishi, Panagiotis Tsiotras
Proceedings of The 5th Annual Learning for Dynamics and Control Conference, PMLR 211:966-979, 2023.

Abstract

Verifying the input-output relationships of a neural network to achieve desired performance specifications is a difficult, yet important, problem due to the growing ubiquity of neural nets in many engineering applications. We use ideas from probability theory in the frequency domain to provide probabilistic verification guarantees for ReLU neural networks. Specifically, we interpret a (deep) feedforward neural network as a discrete-time dynamical system over a finite horizon that shapes distributions of initial states, and use characteristic functions to propagate the distribution of the input data through the network. Using the inverse Fourier transform, we obtain the corresponding cumulative distribution function of the output set, which we use to check if the network is performing as expected given any random point from the input set. The proposed approach does not require distributions to have well-defined moments or moment generating functions. We demonstrate our proposed approach on two examples, and compare its performance to related approaches.
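The core pipeline the abstract describes — propagate a characteristic function (CF) layer by layer, then invert it to a CDF for checking output specifications — can be illustrated in one dimension. The sketch below is not the paper's ReLU algorithm; it only shows the two ingredients the abstract names, using an affine layer (whose CF propagation is exact) and the Gil-Pelaez inversion formula. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

# Characteristic function (CF) of a 1-D Gaussian N(mu, sigma^2).
def cf_gaussian(t, mu=0.0, sigma=1.0):
    return np.exp(1j * t * mu - 0.5 * (sigma * t) ** 2)

# CF propagation through an affine layer y = w*x + b:
# phi_y(t) = exp(i*t*b) * phi_x(w*t).
def cf_affine(cf_x, w, b):
    return lambda t: np.exp(1j * t * b) * cf_x(w * t)

# Gil-Pelaez inversion: recover the CDF at x from the CF,
#   F(x) = 1/2 - (1/pi) * int_0^inf Im[exp(-i*t*x) * phi(t)] / t dt,
# truncated at t_max and integrated with the trapezoid rule.
def cdf_from_cf(cf, x, t_max=50.0, n=20001):
    t = np.linspace(1e-6, t_max, n)
    g = np.imag(np.exp(-1j * t * x) * cf(t)) / t
    integral = np.sum((g[1:] + g[:-1]) * np.diff(t)) / 2.0
    return 0.5 - integral / np.pi

# x ~ N(0,1) pushed through y = 2x + 1, so y ~ N(1, 4).
cf_y = cf_affine(cf_gaussian, w=2.0, b=1.0)
print(cdf_from_cf(cf_y, 1.0))  # CDF at the mean of y; analytically 0.5
```

A verification query of the form "what is the probability the output lands in [a, b]?" then reduces to two CDF evaluations, `cdf_from_cf(cf_y, b) - cdf_from_cf(cf_y, a)`. Handling the ReLU nonlinearity in the frequency domain is the technical contribution of the paper itself and is not reproduced here.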

Cite this Paper


BibTeX
@InProceedings{pmlr-v211-pilipovsky23a,
  title     = {Probabilistic Verification of ReLU Neural Networks via Characteristic Functions},
  author    = {Pilipovsky, Joshua and Sivaramakrishnan, Vignesh and Oishi, Meeko and Tsiotras, Panagiotis},
  booktitle = {Proceedings of The 5th Annual Learning for Dynamics and Control Conference},
  pages     = {966--979},
  year      = {2023},
  editor    = {Matni, Nikolai and Morari, Manfred and Pappas, George J.},
  volume    = {211},
  series    = {Proceedings of Machine Learning Research},
  month     = {15--16 Jun},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v211/pilipovsky23a/pilipovsky23a.pdf},
  url       = {https://proceedings.mlr.press/v211/pilipovsky23a.html},
  abstract  = {Verifying the input-output relationships of a neural network to achieve desired performance specifications is a difficult, yet important, problem due to the growing ubiquity of neural nets in many engineering applications. We use ideas from probability theory in the frequency domain to provide probabilistic verification guarantees for ReLU neural networks. Specifically, we interpret a (deep) feedforward neural network as a discrete-time dynamical system over a finite horizon that shapes distributions of initial states, and use characteristic functions to propagate the distribution of the input data through the network. Using the inverse Fourier transform, we obtain the corresponding cumulative distribution function of the output set, which we use to check if the network is performing as expected given any random point from the input set. The proposed approach does not require distributions to have well-defined moments or moment generating functions. We demonstrate our proposed approach on two examples, and compare its performance to related approaches.}
}
Endnote
%0 Conference Paper
%T Probabilistic Verification of ReLU Neural Networks via Characteristic Functions
%A Joshua Pilipovsky
%A Vignesh Sivaramakrishnan
%A Meeko Oishi
%A Panagiotis Tsiotras
%B Proceedings of The 5th Annual Learning for Dynamics and Control Conference
%C Proceedings of Machine Learning Research
%D 2023
%E Nikolai Matni
%E Manfred Morari
%E George J. Pappas
%F pmlr-v211-pilipovsky23a
%I PMLR
%P 966--979
%U https://proceedings.mlr.press/v211/pilipovsky23a.html
%V 211
%X Verifying the input-output relationships of a neural network to achieve desired performance specifications is a difficult, yet important, problem due to the growing ubiquity of neural nets in many engineering applications. We use ideas from probability theory in the frequency domain to provide probabilistic verification guarantees for ReLU neural networks. Specifically, we interpret a (deep) feedforward neural network as a discrete-time dynamical system over a finite horizon that shapes distributions of initial states, and use characteristic functions to propagate the distribution of the input data through the network. Using the inverse Fourier transform, we obtain the corresponding cumulative distribution function of the output set, which we use to check if the network is performing as expected given any random point from the input set. The proposed approach does not require distributions to have well-defined moments or moment generating functions. We demonstrate our proposed approach on two examples, and compare its performance to related approaches.
APA
Pilipovsky, J., Sivaramakrishnan, V., Oishi, M., & Tsiotras, P. (2023). Probabilistic Verification of ReLU Neural Networks via Characteristic Functions. Proceedings of The 5th Annual Learning for Dynamics and Control Conference, in Proceedings of Machine Learning Research 211:966-979. Available from https://proceedings.mlr.press/v211/pilipovsky23a.html.