Conformal Prediction Sets for Graph Neural Networks

Soroush H. Zargarbashi, Simone Antonelli, Aleksandar Bojchevski
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:12292-12318, 2023.

Abstract

Despite the widespread use of graph neural networks (GNNs) we lack methods to reliably quantify their uncertainty. We propose a conformal procedure to equip GNNs with prediction sets that come with distribution-free guarantees – the output set contains the true label with arbitrarily high probability. Our post-processing procedure can wrap around any (pretrained) GNN, and unlike existing methods, results in meaningful sets even when the model provides only the top class. The key idea is to diffuse the node-wise conformity scores to incorporate neighborhood information. By leveraging the network homophily we construct sets with comparable or better efficiency (average size) and significantly improved singleton hit ratio (correct sets of size one). In addition to an extensive empirical evaluation, we investigate the theoretical conditions under which smoothing provably improves efficiency.
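The procedure the abstract describes can be sketched in a few lines. This is a minimal illustration on synthetic data, not the authors' implementation: it assumes a toy homophilous graph, a hypothetical diffusion weight `lam`, and standard split-conformal calibration with score s(x, y) = 1 − p_y; only `numpy` is used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy homophilous graph: two clusters of 50 nodes, 3 classes.
n, k = 100, 3
labels = np.array([0] * 50 + [1] * 50)

# Hypothetical (pretrained) model output: noisy class probabilities
# that favor the true label.
logits = rng.normal(0, 1, (n, k))
logits[np.arange(n), labels] += 2.0
probs = np.exp(logits) / np.exp(logits).sum(1, keepdims=True)

# Adjacency: edges mostly within clusters (homophily).
A = (rng.random((n, n)) < 0.1) & (labels[:, None] == labels[None, :])
A = (A | A.T).astype(float)
np.fill_diagonal(A, 0.0)
deg = np.clip(A.sum(1), 1, None)

# Key idea (sketched): diffuse node-wise scores, mixing each node's
# score with the mean score of its neighbors. lam is an assumed weight.
lam = 0.5
diffused = (1 - lam) * probs + lam * (A @ probs) / deg[:, None]

# Split-conformal calibration at miscoverage level alpha.
alpha = 0.1
cal = rng.choice(n, size=50, replace=False)
test = np.setdiff1d(np.arange(n), cal)
cal_scores = 1 - diffused[cal, labels[cal]]
m = len(cal)
q_idx = int(np.ceil((m + 1) * (1 - alpha))) - 1  # conformal quantile index
qhat = np.sort(cal_scores)[q_idx]

# Prediction set per node: all classes whose score is below the threshold.
sets = 1 - diffused[test] <= qhat
coverage = sets[np.arange(len(test)), labels[test]].mean()
print(f"empirical coverage: {coverage:.2f}")
```

Under exchangeability of calibration and test nodes, the sets contain the true label with probability at least 1 − alpha; the paper's contribution is that diffusing the scores over the neighborhood tends to shrink the sets (better efficiency) without losing that guarantee.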

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-h-zargarbashi23a,
  title     = {Conformal Prediction Sets for Graph Neural Networks},
  author    = {H. Zargarbashi, Soroush and Antonelli, Simone and Bojchevski, Aleksandar},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {12292--12318},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/h-zargarbashi23a/h-zargarbashi23a.pdf},
  url       = {https://proceedings.mlr.press/v202/h-zargarbashi23a.html},
  abstract  = {Despite the widespread use of graph neural networks (GNNs) we lack methods to reliably quantify their uncertainty. We propose a conformal procedure to equip GNNs with prediction sets that come with distribution-free guarantees – the output set contains the true label with arbitrarily high probability. Our post-processing procedure can wrap around any (pretrained) GNN, and unlike existing methods, results in meaningful sets even when the model provides only the top class. The key idea is to diffuse the node-wise conformity scores to incorporate neighborhood information. By leveraging the network homophily we construct sets with comparable or better efficiency (average size) and significantly improved singleton hit ratio (correct sets of size one). In addition to an extensive empirical evaluation, we investigate the theoretical conditions under which smoothing provably improves efficiency.}
}
Endnote
%0 Conference Paper
%T Conformal Prediction Sets for Graph Neural Networks
%A Soroush H. Zargarbashi
%A Simone Antonelli
%A Aleksandar Bojchevski
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-h-zargarbashi23a
%I PMLR
%P 12292--12318
%U https://proceedings.mlr.press/v202/h-zargarbashi23a.html
%V 202
%X Despite the widespread use of graph neural networks (GNNs) we lack methods to reliably quantify their uncertainty. We propose a conformal procedure to equip GNNs with prediction sets that come with distribution-free guarantees – the output set contains the true label with arbitrarily high probability. Our post-processing procedure can wrap around any (pretrained) GNN, and unlike existing methods, results in meaningful sets even when the model provides only the top class. The key idea is to diffuse the node-wise conformity scores to incorporate neighborhood information. By leveraging the network homophily we construct sets with comparable or better efficiency (average size) and significantly improved singleton hit ratio (correct sets of size one). In addition to an extensive empirical evaluation, we investigate the theoretical conditions under which smoothing provably improves efficiency.
APA
H. Zargarbashi, S., Antonelli, S. & Bojchevski, A. (2023). Conformal Prediction Sets for Graph Neural Networks. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:12292-12318. Available from https://proceedings.mlr.press/v202/h-zargarbashi23a.html.