Faster Optimization on Sparse Graphs via Neural Reparametrization
Proceedings of the Third Learning on Graphs Conference, PMLR 269:26:1-26:21, 2025.
Abstract
Many scientific problems involve energy minimization on sparse graphs, such as heat diffusion in solids and the synchronization of coupled oscillators. These problems often exhibit slow convergence, particularly when the graph is random, as in glassy systems. We show that graph neural networks (GNNs) can significantly speed up such optimization problems. Our idea is to represent the state of each node as the output of a graph neural network. We demonstrate the benefit of this GNN reparametrization in experiments on heat diffusion and on the synchronization of nonlinear oscillators. When optimizing with gradient descent, we show that this GNN reparametrization has the effect of a quasi-Newton method.
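To make the core idea concrete, here is a minimal illustrative sketch (not the paper's code): we minimize a heat-diffusion (Dirichlet) energy on a small path graph, but instead of optimizing the node states directly, we reparametrize them as the output of a one-layer graph convolution applied to fixed random node features, and run gradient descent on the convolution weights. The graph, feature width, penalty strength, and step-size rule are all illustrative choices, not taken from the paper.

```python
import numpy as np

# Illustrative sketch: minimize the Dirichlet energy E(x) = x^T L x on a
# path graph, pinning the two endpoint values with a soft penalty.
# Instead of optimizing the node states x directly, we set x = A_hat @ Z @ w
# (a one-layer graph convolution over fixed random features Z) and run
# gradient descent on the weights w.
rng = np.random.default_rng(0)
n = 30
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = np.diag(A.sum(1)) - A                 # graph Laplacian

A_hat = A + np.eye(n)
A_hat /= A_hat.sum(1, keepdims=True)      # row-normalized propagation
Z = rng.standard_normal((n, 8))           # fixed random node features
M = A_hat @ Z                             # reparametrization: x = M @ w

b = np.zeros(n); b[-1] = 1.0              # boundary targets: x_0 = 0, x_{n-1} = 1
mask = np.zeros(n); mask[0] = mask[-1] = 1.0
lam = 10.0                                # boundary-penalty strength

def energy(x):
    return x @ L @ x + lam * np.sum(mask * (x - b) ** 2)

# The objective is quadratic in w, so a stable step size follows from the
# largest eigenvalue of the reparametrized Hessian 2 M^T K M -- the object
# that, in the paper's argument, the GNN implicitly preconditions.
K = L + lam * np.diag(mask)
lr = 1.0 / np.linalg.eigvalsh(2 * M.T @ K @ M).max()

w = 0.1 * rng.standard_normal(8)
E0 = energy(M @ w)
for _ in range(20000):
    x = M @ w
    grad_x = 2 * (K @ x - lam * mask * b)
    w -= lr * (M.T @ grad_x)              # chain rule: grad_w = M^T grad_x

x = M @ w                                 # final node states
```

The reparametrization changes the effective loss landscape: gradient descent on `w` follows directions shaped by `M^T K M` rather than `K` itself, which is the preconditioning effect the abstract attributes to the GNN.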