Theory-guided Message Passing Neural Network for Probabilistic Inference

Zijun Cui, Hanjing Wang, Tian Gao, Kartik Talamadupula, Qiang Ji
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:667-675, 2024.

Abstract

Probabilistic inference can be tackled by minimizing a variational free energy through message passing. To improve performance, neural networks are adopted for message computation. Neural message learning is heuristic and requires strong guidance to perform well. In this work, we propose a theory-guided message passing neural network (TMPNN) for probabilistic inference. Inspired by existing work, we consider a generalized Bethe free energy that allows for a learnable variational assumption. Instead of using a black-box neural network for message computation, we utilize a general message equation and introduce a symbolic message function with semantically meaningful parameters. The analytically derived symbolic message function is seamlessly integrated into the MPNN framework, giving rise to the proposed TMPNN. TMPNN is trained using algorithmic supervision without requiring exact inference results. Leveraging the theory-guided symbolic function, TMPNN offers strengthened theoretical guarantees compared to conventional heuristic neural models. It applies to both MAP and marginal inference tasks, outperforming state-of-the-art methods in both cases. Furthermore, TMPNN provides improved generalizability across various graph structures and exhibits enhanced data efficiency.
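
For background, the sketch below gives the standard (non-generalized) Bethe free energy and the classical sum-product message update whose fixed points are its stationary points (Yedidia et al.); this is standard material, assuming a factor graph with factors f_a, beliefs b_a and b_i, and variable degrees d_i. The paper's generalized Bethe free energy and learnable symbolic message function modify these standard forms and are not reproduced here.

\begin{align}
F_{\mathrm{Bethe}} &= \sum_{a}\sum_{x_a} b_a(x_a)\,\ln\frac{b_a(x_a)}{f_a(x_a)}
  \;-\; \sum_{i}(d_i - 1)\sum_{x_i} b_i(x_i)\,\ln b_i(x_i), \\
m_{a\to i}(x_i) &\propto \sum_{x_a \setminus x_i} f_a(x_a)
  \prod_{j \in N(a)\setminus i}\;\prod_{c \in N(j)\setminus a} m_{c\to j}(x_j),
\end{align}

where d_i is the number of factors adjacent to variable i. Loopy belief propagation iterates the message update above; its fixed points correspond to stationary points of F_Bethe.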

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-cui24a,
  title     = {Theory-guided Message Passing Neural Network for Probabilistic Inference},
  author    = {Cui, Zijun and Wang, Hanjing and Gao, Tian and Talamadupula, Kartik and Ji, Qiang},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {667--675},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/cui24a/cui24a.pdf},
  url       = {https://proceedings.mlr.press/v238/cui24a.html},
  abstract  = {Probabilistic inference can be tackled by minimizing a variational free energy through message passing. To improve performance, neural networks are adopted for message computation. Neural message learning is heuristic and requires strong guidance to perform well. In this work, we propose a {\em theory-guided message passing neural network} (TMPNN) for probabilistic inference. Inspired by existing work, we consider a generalized Bethe free energy which allows for a learnable variational assumption. Instead of using a black-box neural network for message computation, we utilize a general message equation and introduce a symbolic message function with semantically meaningful parameters. The analytically derived symbolic message function is seamlessly integrated into the MPNN framework, giving rise to the proposed TMPNN. TMPNN is trained using algorithmic supervision without requiring exact inference results. Leveraging the theory-guided symbolic function, TMPNN offers strengthened theoretical guarantees compared to conventional heuristic neural models. It presents a novel contribution by demonstrating its applicability to both MAP and marginal inference tasks, outperforming SOTAs in both cases. Furthermore, TMPNN provides improved generalizability across various graph structures and exhibits enhanced data efficiency.}
}
Endnote
%0 Conference Paper
%T Theory-guided Message Passing Neural Network for Probabilistic Inference
%A Zijun Cui
%A Hanjing Wang
%A Tian Gao
%A Kartik Talamadupula
%A Qiang Ji
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-cui24a
%I PMLR
%P 667--675
%U https://proceedings.mlr.press/v238/cui24a.html
%V 238
%X Probabilistic inference can be tackled by minimizing a variational free energy through message passing. To improve performance, neural networks are adopted for message computation. Neural message learning is heuristic and requires strong guidance to perform well. In this work, we propose a theory-guided message passing neural network (TMPNN) for probabilistic inference. Inspired by existing work, we consider a generalized Bethe free energy which allows for a learnable variational assumption. Instead of using a black-box neural network for message computation, we utilize a general message equation and introduce a symbolic message function with semantically meaningful parameters. The analytically derived symbolic message function is seamlessly integrated into the MPNN framework, giving rise to the proposed TMPNN. TMPNN is trained using algorithmic supervision without requiring exact inference results. Leveraging the theory-guided symbolic function, TMPNN offers strengthened theoretical guarantees compared to conventional heuristic neural models. It presents a novel contribution by demonstrating its applicability to both MAP and marginal inference tasks, outperforming SOTAs in both cases. Furthermore, TMPNN provides improved generalizability across various graph structures and exhibits enhanced data efficiency.
APA
Cui, Z., Wang, H., Gao, T., Talamadupula, K. & Ji, Q. (2024). Theory-guided Message Passing Neural Network for Probabilistic Inference. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:667-675. Available from https://proceedings.mlr.press/v238/cui24a.html.