Distribution Free Prediction Sets for Node Classification

Jase Clarkson
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:6268-6278, 2023.

Abstract

Graph Neural Networks (GNNs) are able to achieve high classification accuracy on many important real world datasets, but provide no rigorous notion of predictive uncertainty. Quantifying the confidence of GNN models is difficult due to the dependence between datapoints induced by the graph structure. We leverage recent advances in conformal prediction to construct prediction sets for node classification in inductive learning scenarios. We do this by taking an existing approach for conformal classification that relies on exchangeable data and modifying it by appropriately weighting the conformal scores to reflect the network structure. We show through experiments on standard benchmark datasets using popular GNN models that our approach provides tighter and better calibrated prediction sets than a naive application of conformal prediction.
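For illustration only, below is a minimal sketch of split conformal classification with a weighted calibration quantile, in the spirit of the weighted-score idea the abstract describes. The nonconformity score (one minus the softmax probability of the true class), the uniform calibration weights, and the unit weight given to the test point are placeholders; the paper's actual graph-dependent weighting scheme is not reproduced here.

# Minimal sketch: split conformal classification with a weighted calibration
# quantile. The weights and scores below are illustrative placeholders, not
# the paper's method.
import numpy as np

def conformal_scores(probs, labels):
    # Nonconformity score: one minus the softmax probability of the true class.
    return 1.0 - probs[np.arange(len(labels)), labels]

def weighted_quantile(scores, weights, alpha):
    # Smallest score s such that the normalized cumulative weight of
    # calibration scores <= s reaches 1 - alpha; the test point gets unit
    # weight (an assumption) and a point mass at +infinity.
    order = np.argsort(scores)
    scores, weights = scores[order], weights[order]
    cum = np.cumsum(weights) / (weights.sum() + 1.0)
    idx = np.searchsorted(cum, 1.0 - alpha)
    return scores[idx] if idx < len(scores) else np.inf

def prediction_set(test_probs, qhat):
    # Keep every class whose nonconformity score is within the threshold.
    return np.where(1.0 - test_probs <= qhat)[0]

# Toy example with random stand-ins for GNN softmax outputs on calibration nodes.
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(7), size=500)   # 7 classes, 500 calibration nodes
cal_labels = rng.integers(0, 7, size=500)
weights = np.ones(500)                            # uniform weights = plain split conformal
qhat = weighted_quantile(conformal_scores(cal_probs, cal_labels), weights, alpha=0.1)
print(prediction_set(rng.dirichlet(np.ones(7)), qhat))

With uniform weights this reduces to standard split conformal prediction; the contribution described in the abstract replaces them with weights chosen to reflect the network structure.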

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-clarkson23a,
  title     = {Distribution Free Prediction Sets for Node Classification},
  author    = {Clarkson, Jase},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {6268--6278},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/clarkson23a/clarkson23a.pdf},
  url       = {https://proceedings.mlr.press/v202/clarkson23a.html},
  abstract  = {Graph Neural Networks (GNNs) are able to achieve high classification accuracy on many important real world datasets, but provide no rigorous notion of predictive uncertainty. Quantifying the confidence of GNN models is difficult due to the dependence between datapoints induced by the graph structure. We leverage recent advances in conformal prediction to construct prediction sets for node classification in inductive learning scenarios. We do this by taking an existing approach for conformal classification that relies on exchangeable data and modifying it by appropriately weighting the conformal scores to reflect the network structure. We show through experiments on standard benchmark datasets using popular GNN models that our approach provides tighter and better calibrated prediction sets than a naive application of conformal prediction.}
}
Endnote
%0 Conference Paper
%T Distribution Free Prediction Sets for Node Classification
%A Jase Clarkson
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-clarkson23a
%I PMLR
%P 6268--6278
%U https://proceedings.mlr.press/v202/clarkson23a.html
%V 202
%X Graph Neural Networks (GNNs) are able to achieve high classification accuracy on many important real world datasets, but provide no rigorous notion of predictive uncertainty. Quantifying the confidence of GNN models is difficult due to the dependence between datapoints induced by the graph structure. We leverage recent advances in conformal prediction to construct prediction sets for node classification in inductive learning scenarios. We do this by taking an existing approach for conformal classification that relies on exchangeable data and modifying it by appropriately weighting the conformal scores to reflect the network structure. We show through experiments on standard benchmark datasets using popular GNN models that our approach provides tighter and better calibrated prediction sets than a naive application of conformal prediction.
APA
Clarkson, J. (2023). Distribution Free Prediction Sets for Node Classification. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:6268-6278. Available from https://proceedings.mlr.press/v202/clarkson23a.html.