MG-GNN: Multigrid Graph Neural Networks for Learning Multilevel Domain Decomposition Methods

Ali Taghibakhshi, Nicolas Nytko, Tareq Uz Zaman, Scott Maclachlan, Luke Olson, Matthew West
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:33381-33395, 2023.

Abstract

Domain decomposition methods (DDMs) are popular solvers for discretized systems of partial differential equations (PDEs), with one-level and multilevel variants. These solvers rely on several algorithmic and mathematical parameters, prescribing overlap, subdomain boundary conditions, and other properties of the DDM. While some work has been done on optimizing these parameters, it has mostly focused on the one-level setting or special cases such as structured-grid discretizations with regular subdomain construction. In this paper, we propose multigrid graph neural networks (MG-GNN), a novel GNN architecture for learning optimized parameters in two-level DDMs. We train MG-GNN using a new unsupervised loss function, enabling effective training on small problems that yields robust performance on unstructured grids that are orders of magnitude larger than those in the training set. We show that MG-GNN outperforms popular hierarchical graph network architectures for this optimization and that our proposed loss function is critical to achieving this improved performance.

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-taghibakhshi23a,
  title     = {{MG}-{GNN}: Multigrid Graph Neural Networks for Learning Multilevel Domain Decomposition Methods},
  author    = {Taghibakhshi, Ali and Nytko, Nicolas and Zaman, Tareq Uz and Maclachlan, Scott and Olson, Luke and West, Matthew},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {33381--33395},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/taghibakhshi23a/taghibakhshi23a.pdf},
  url       = {https://proceedings.mlr.press/v202/taghibakhshi23a.html},
  abstract  = {Domain decomposition methods (DDMs) are popular solvers for discretized systems of partial differential equations (PDEs), with one-level and multilevel variants. These solvers rely on several algorithmic and mathematical parameters, prescribing overlap, subdomain boundary conditions, and other properties of the DDM. While some work has been done on optimizing these parameters, it has mostly focused on the one-level setting or special cases such as structured-grid discretizations with regular subdomain construction. In this paper, we propose multigrid graph neural networks (MG-GNN), a novel GNN architecture for learning optimized parameters in two-level DDMs. We train MG-GNN using a new unsupervised loss function, enabling effective training on small problems that yields robust performance on unstructured grids that are orders of magnitude larger than those in the training set. We show that MG-GNN outperforms popular hierarchical graph network architectures for this optimization and that our proposed loss function is critical to achieving this improved performance.}
}
Endnote
%0 Conference Paper
%T MG-GNN: Multigrid Graph Neural Networks for Learning Multilevel Domain Decomposition Methods
%A Ali Taghibakhshi
%A Nicolas Nytko
%A Tareq Uz Zaman
%A Scott Maclachlan
%A Luke Olson
%A Matthew West
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-taghibakhshi23a
%I PMLR
%P 33381--33395
%U https://proceedings.mlr.press/v202/taghibakhshi23a.html
%V 202
%X Domain decomposition methods (DDMs) are popular solvers for discretized systems of partial differential equations (PDEs), with one-level and multilevel variants. These solvers rely on several algorithmic and mathematical parameters, prescribing overlap, subdomain boundary conditions, and other properties of the DDM. While some work has been done on optimizing these parameters, it has mostly focused on the one-level setting or special cases such as structured-grid discretizations with regular subdomain construction. In this paper, we propose multigrid graph neural networks (MG-GNN), a novel GNN architecture for learning optimized parameters in two-level DDMs. We train MG-GNN using a new unsupervised loss function, enabling effective training on small problems that yields robust performance on unstructured grids that are orders of magnitude larger than those in the training set. We show that MG-GNN outperforms popular hierarchical graph network architectures for this optimization and that our proposed loss function is critical to achieving this improved performance.
APA
Taghibakhshi, A., Nytko, N., Zaman, T.U., Maclachlan, S., Olson, L. & West, M. (2023). MG-GNN: Multigrid Graph Neural Networks for Learning Multilevel Domain Decomposition Methods. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:33381-33395. Available from https://proceedings.mlr.press/v202/taghibakhshi23a.html.
