Input uncertainty propagation through trained neural networks

Paul Monchot, Loic Coquelin, Sébastien Julien Petit, Sébastien Marmin, Erwan Le Pennec, Nicolas Fischer
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:25140-25173, 2023.

Abstract

When physical sensors are involved, such as image sensors, the uncertainty over the input data is often a major component of the output uncertainty of machine learning models. In this work, we address the problem of input uncertainty propagation through trained neural networks. We do not rely on a Gaussian distribution assumption of the output or of any intermediate layer. We propagate instead a Gaussian Mixture Model (GMM) that offers much more flexibility, using the Split&Merge algorithm. This paper's main contribution is the computation of a Wasserstein criterion to control the Gaussian splitting procedure, for which theoretical guarantees of convergence on the output distribution estimates are derived. The methodology is tested against a wide range of datasets and networks. It shows robustness and genericity, and offers highly accurate output probability density function estimation while maintaining a reasonable computational cost compared with the standard Monte Carlo (MC) approach.
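For context, the standard Monte Carlo baseline the abstract compares against can be sketched as follows. This is a toy illustration only, not the paper's Split&Merge method: the small ReLU network and the input noise covariance are hypothetical placeholders standing in for a trained model and a sensor noise model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "trained" network: two dense layers with a ReLU activation.
W1, b1 = rng.standard_normal((8, 2)), np.zeros(8)
W2, b2 = rng.standard_normal((1, 8)), np.zeros(1)

def net(x):
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
    return W2 @ h + b2                # linear output layer

# Gaussian input uncertainty centered on a nominal input x0
# (e.g. a noisy sensor reading); covariance chosen for illustration.
x0 = np.array([0.5, -1.0])
cov = 0.05 * np.eye(2)

# MC propagation: sample perturbed inputs, push each through the network,
# and use the empirical output distribution as the uncertainty estimate.
samples = rng.multivariate_normal(x0, cov, size=10_000)
outputs = np.array([net(x) for x in samples]).ravel()

print(f"output mean ≈ {outputs.mean():.3f}, std ≈ {outputs.std():.3f}")
```

The cost of this baseline grows with the number of forward passes, which is the motivation for propagating a parametric GMM through the network instead.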

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-monchot23a,
  title = {Input uncertainty propagation through trained neural networks},
  author = {Monchot, Paul and Coquelin, Loic and Petit, S\'{e}bastien Julien and Marmin, S\'{e}bastien and Le Pennec, Erwan and Fischer, Nicolas},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages = {25140--25173},
  year = {2023},
  editor = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume = {202},
  series = {Proceedings of Machine Learning Research},
  month = {23--29 Jul},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v202/monchot23a/monchot23a.pdf},
  url = {https://proceedings.mlr.press/v202/monchot23a.html},
  abstract = {When physical sensors are involved, such as image sensors, the uncertainty over the input data is often a major component of the output uncertainty of machine learning models. In this work, we address the problem of input uncertainty propagation through trained neural networks. We do not rely on a Gaussian distribution assumption of the output or of any intermediate layer. We propagate instead a Gaussian Mixture Model (GMM) that offers much more flexibility, using the Split\&Merge algorithm. This paper's main contribution is the computation of a Wasserstein criterion to control the Gaussian splitting procedure, for which theoretical guarantees of convergence on the output distribution estimates are derived. The methodology is tested against a wide range of datasets and networks. It shows robustness and genericity, and offers highly accurate output probability density function estimation while maintaining a reasonable computational cost compared with the standard Monte Carlo (MC) approach.}
}
Endnote
%0 Conference Paper
%T Input uncertainty propagation through trained neural networks
%A Paul Monchot
%A Loic Coquelin
%A Sébastien Julien Petit
%A Sébastien Marmin
%A Erwan Le Pennec
%A Nicolas Fischer
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-monchot23a
%I PMLR
%P 25140--25173
%U https://proceedings.mlr.press/v202/monchot23a.html
%V 202
%X When physical sensors are involved, such as image sensors, the uncertainty over the input data is often a major component of the output uncertainty of machine learning models. In this work, we address the problem of input uncertainty propagation through trained neural networks. We do not rely on a Gaussian distribution assumption of the output or of any intermediate layer. We propagate instead a Gaussian Mixture Model (GMM) that offers much more flexibility, using the Split&Merge algorithm. This paper's main contribution is the computation of a Wasserstein criterion to control the Gaussian splitting procedure, for which theoretical guarantees of convergence on the output distribution estimates are derived. The methodology is tested against a wide range of datasets and networks. It shows robustness and genericity, and offers highly accurate output probability density function estimation while maintaining a reasonable computational cost compared with the standard Monte Carlo (MC) approach.
APA
Monchot, P., Coquelin, L., Petit, S.J., Marmin, S., Le Pennec, E. &amp; Fischer, N. (2023). Input uncertainty propagation through trained neural networks. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:25140-25173. Available from https://proceedings.mlr.press/v202/monchot23a.html.