Bayesian Graph Neural Networks with Adaptive Connection Sampling

Arman Hasanzadeh, Ehsan Hajiramezanali, Shahin Boluki, Mingyuan Zhou, Nick Duffield, Krishna Narayanan, Xiaoning Qian
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:4094-4104, 2020.

Abstract

We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs) that generalizes existing stochastic regularization methods for training GNNs. The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs. Instead of using fixed sampling rates or hand-tuning them as model hyperparameters as in existing stochastic regularization methods, our adaptive connection sampling can be trained jointly with GNN model parameters in both global and local fashions. GNN training with adaptive connection sampling is shown to be mathematically equivalent to an efficient approximation of training Bayesian GNNs. Experimental results with ablation studies on benchmark datasets validate that adaptively learning the sampling rate given graph training data is the key to boosting the performance of GNNs in semi-supervised node classification, making them less prone to over-smoothing and over-fitting while yielding more robust predictions.
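To make the idea of connection sampling concrete, the sketch below shows the simpler fixed-rate variant that the paper generalizes: each edge of the graph is kept independently with some probability during message passing, and predictions are averaged over multiple sampled masks, mirroring how Monte Carlo dropout approximates Bayesian inference. This is a minimal illustration with a hand-set `keep_prob`, not the paper's adaptive formulation, where the rate itself is learned jointly with the GNN parameters.

```python
import numpy as np

def sample_connections(adj, keep_prob, rng):
    """Bernoulli-sample a binary mask over a dense adjacency matrix:
    each connection is kept independently with probability keep_prob."""
    mask = rng.random(adj.shape) < keep_prob
    return adj * mask

def gnn_layer(adj, features, weight, keep_prob, rng):
    """One message-passing layer over a randomly sparsified adjacency,
    followed by a ReLU nonlinearity."""
    sampled_adj = sample_connections(adj, keep_prob, rng)
    return np.maximum(sampled_adj @ features @ weight, 0.0)

rng = np.random.default_rng(0)
adj = np.ones((4, 4)) - np.eye(4)          # toy 4-node complete graph
features = rng.standard_normal((4, 3))      # node features
weight = rng.standard_normal((3, 2))        # layer weights

# Monte Carlo prediction: average over several sampled masks, as in
# dropout-based approximate Bayesian inference.
samples = [gnn_layer(adj, features, weight, keep_prob=0.8, rng=rng)
           for _ in range(50)]
mean_pred = np.mean(samples, axis=0)
print(mean_pred.shape)
```

In the adaptive scheme proposed in the paper, `keep_prob` would not be a fixed hyperparameter but a quantity optimized during training, either globally or per layer/edge.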

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-hasanzadeh20a,
  title     = {{B}ayesian Graph Neural Networks with Adaptive Connection Sampling},
  author    = {Hasanzadeh, Arman and Hajiramezanali, Ehsan and Boluki, Shahin and Zhou, Mingyuan and Duffield, Nick and Narayanan, Krishna and Qian, Xiaoning},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4094--4104},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/hasanzadeh20a/hasanzadeh20a.pdf},
  url       = {https://proceedings.mlr.press/v119/hasanzadeh20a.html},
  abstract  = {We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs) that generalizes existing stochastic regularization methods for training GNNs. The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs. Instead of using fixed sampling rates or hand-tuning them as model hyperparameters as in existing stochastic regularization methods, our adaptive connection sampling can be trained jointly with GNN model parameters in both global and local fashions. GNN training with adaptive connection sampling is shown to be mathematically equivalent to an efficient approximation of training Bayesian GNNs. Experimental results with ablation studies on benchmark datasets validate that adaptively learning the sampling rate given graph training data is the key to boosting the performance of GNNs in semi-supervised node classification, making them less prone to over-smoothing and over-fitting while yielding more robust predictions.}
}
Endnote
%0 Conference Paper
%T Bayesian Graph Neural Networks with Adaptive Connection Sampling
%A Arman Hasanzadeh
%A Ehsan Hajiramezanali
%A Shahin Boluki
%A Mingyuan Zhou
%A Nick Duffield
%A Krishna Narayanan
%A Xiaoning Qian
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-hasanzadeh20a
%I PMLR
%P 4094--4104
%U https://proceedings.mlr.press/v119/hasanzadeh20a.html
%V 119
%X We propose a unified framework for adaptive connection sampling in graph neural networks (GNNs) that generalizes existing stochastic regularization methods for training GNNs. The proposed framework not only alleviates over-smoothing and over-fitting tendencies of deep GNNs, but also enables learning with uncertainty in graph analytic tasks with GNNs. Instead of using fixed sampling rates or hand-tuning them as model hyperparameters as in existing stochastic regularization methods, our adaptive connection sampling can be trained jointly with GNN model parameters in both global and local fashions. GNN training with adaptive connection sampling is shown to be mathematically equivalent to an efficient approximation of training Bayesian GNNs. Experimental results with ablation studies on benchmark datasets validate that adaptively learning the sampling rate given graph training data is the key to boosting the performance of GNNs in semi-supervised node classification, making them less prone to over-smoothing and over-fitting while yielding more robust predictions.
APA
Hasanzadeh, A., Hajiramezanali, E., Boluki, S., Zhou, M., Duffield, N., Narayanan, K. & Qian, X. (2020). Bayesian Graph Neural Networks with Adaptive Connection Sampling. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:4094-4104. Available from https://www.proceedings.mlr.press/v119/hasanzadeh20a.html.
