ScatterSample: Diversified Label Sampling for Data Efficient Graph Neural Network Learning

Zhenwei DAI, Vasileios Ioannidis, Soji Adeshina, Zak Jost, Christos Faloutsos, George Karypis
Proceedings of the First Learning on Graphs Conference, PMLR 198:35:1-35:15, 2022.

Abstract

What target labels are most effective for graph neural network (GNN) training? In some applications where GNNs excel, such as drug design or fraud detection, labeling new instances is expensive. We develop a data-efficient active sampling framework, ScatterSample, to train GNNs under an active learning setting. ScatterSample employs a sampling module termed DiverseUncertainty to collect instances with large uncertainty from different regions of the sample space for labeling. To ensure diversification of the selected nodes, DiverseUncertainty clusters the high-uncertainty nodes and selects a representative node from each cluster. Our ScatterSample algorithm is further supported by rigorous theoretical analysis demonstrating its advantage over standard active sampling methods that simply maximize uncertainty without diversifying the samples. In particular, we show that ScatterSample efficiently reduces the model's uncertainty over the whole sample space. Our experiments on five datasets show that ScatterSample significantly outperforms other GNN active learning baselines; specifically, it reduces the sampling cost by up to 50% while achieving the same test accuracy.
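
For intuition, the selection step the abstract describes can be sketched in a few lines: score each unlabeled node's uncertainty, keep the most uncertain candidates, cluster them, and label one representative per cluster. The sketch below is an illustration only; the function name diverse_uncertainty_sample, the entropy score, the k-means clustering, and the pool_factor knob are our assumptions, not the authors' exact implementation.

    import numpy as np
    from sklearn.cluster import KMeans

    def diverse_uncertainty_sample(probs, embeddings, budget, pool_factor=10):
        # probs:      (n_nodes, n_classes) softmax outputs of the current GNN
        # embeddings: (n_nodes, d) node representations used for clustering
        # budget:     number of nodes to send for labeling this round

        # 1) Score each node's uncertainty; predictive entropy is one
        #    common choice (the paper's exact measure may differ).
        entropy = -np.sum(probs * np.log(probs + 1e-12), axis=1)

        # 2) Keep only the most uncertain candidates as the clustering pool.
        pool_size = min(budget * pool_factor, len(entropy))
        pool = np.argsort(entropy)[-pool_size:]

        # 3) Cluster the pool so the selected nodes come from different
        #    regions of the sample space, then take the most uncertain
        #    node in each cluster as its representative.
        clusters = KMeans(n_clusters=budget, n_init=10).fit_predict(embeddings[pool])
        selected = []
        for c in range(budget):
            members = pool[clusters == c]
            if len(members) > 0:
                selected.append(members[np.argmax(entropy[members])])
        return np.array(selected)

In the active learning setting the abstract describes, training would alternate between fitting the GNN on the current labeled set, running a sampler like this, and querying labels for the returned nodes until the labeling budget is spent.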

Cite this Paper
BibTeX
@InProceedings{pmlr-v198-dai22a,
  title = {ScatterSample: Diversified Label Sampling for Data Efficient Graph Neural Network Learning},
  author = {DAI, Zhenwei and Ioannidis, Vasileios and Adeshina, Soji and Jost, Zak and Faloutsos, Christos and Karypis, George},
  booktitle = {Proceedings of the First Learning on Graphs Conference},
  pages = {35:1--35:15},
  year = {2022},
  editor = {Rieck, Bastian and Pascanu, Razvan},
  volume = {198},
  series = {Proceedings of Machine Learning Research},
  month = {09--12 Dec},
  publisher = {PMLR},
  pdf = {https://proceedings.mlr.press/v198/dai22a/dai22a.pdf},
  url = {https://proceedings.mlr.press/v198/dai22a.html},
  abstract = {What target labels are most effective for graph neural network (GNN) training? In some applications where GNNs excel, such as drug design or fraud detection, labeling new instances is expensive. We develop a data-efficient active sampling framework, ScatterSample, to train GNNs under an active learning setting. ScatterSample employs a sampling module termed DiverseUncertainty to collect instances with large uncertainty from different regions of the sample space for labeling. To ensure diversification of the selected nodes, DiverseUncertainty clusters the high-uncertainty nodes and selects a representative node from each cluster. Our ScatterSample algorithm is further supported by rigorous theoretical analysis demonstrating its advantage over standard active sampling methods that simply maximize uncertainty without diversifying the samples. In particular, we show that ScatterSample efficiently reduces the model's uncertainty over the whole sample space. Our experiments on five datasets show that ScatterSample significantly outperforms other GNN active learning baselines; specifically, it reduces the sampling cost by up to 50% while achieving the same test accuracy.}
}
Endnote
%0 Conference Paper
%T ScatterSample: Diversified Label Sampling for Data Efficient Graph Neural Network Learning
%A Zhenwei DAI
%A Vasileios Ioannidis
%A Soji Adeshina
%A Zak Jost
%A Christos Faloutsos
%A George Karypis
%B Proceedings of the First Learning on Graphs Conference
%C Proceedings of Machine Learning Research
%D 2022
%E Bastian Rieck
%E Razvan Pascanu
%F pmlr-v198-dai22a
%I PMLR
%P 35:1--35:15
%U https://proceedings.mlr.press/v198/dai22a.html
%V 198
%X What target labels are most effective for graph neural network (GNN) training? In some applications where GNNs excel, such as drug design or fraud detection, labeling new instances is expensive. We develop a data-efficient active sampling framework, ScatterSample, to train GNNs under an active learning setting. ScatterSample employs a sampling module termed DiverseUncertainty to collect instances with large uncertainty from different regions of the sample space for labeling. To ensure diversification of the selected nodes, DiverseUncertainty clusters the high-uncertainty nodes and selects a representative node from each cluster. Our ScatterSample algorithm is further supported by rigorous theoretical analysis demonstrating its advantage over standard active sampling methods that simply maximize uncertainty without diversifying the samples. In particular, we show that ScatterSample efficiently reduces the model's uncertainty over the whole sample space. Our experiments on five datasets show that ScatterSample significantly outperforms other GNN active learning baselines; specifically, it reduces the sampling cost by up to 50% while achieving the same test accuracy.
APA
DAI, Z., Ioannidis, V., Adeshina, S., Jost, Z., Faloutsos, C. & Karypis, G. (2022). ScatterSample: Diversified Label Sampling for Data Efficient Graph Neural Network Learning. Proceedings of the First Learning on Graphs Conference, in Proceedings of Machine Learning Research 198:35:1-35:15. Available from https://proceedings.mlr.press/v198/dai22a.html.
