HEAP: Hyper Extended A-PDHG Operator for Constrained High-dim PDEs

Mingquan Feng, Weixin Liao, Yixin Huang, Yifan Fu, Qifu Zheng, Junchi Yan
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:16825-16836, 2025.

Abstract

Neural operators have emerged as a promising approach for solving high-dimensional partial differential equations (PDEs). However, existing neural operators often have difficulty in dealing with constrained PDEs, where the solution must satisfy additional equality or inequality constraints beyond the governing equations. To close this gap, we propose a novel neural operator, Hyper Extended Adaptive PDHG (HEAP) for constrained high-dim PDEs, where the learned operator evolves in the parameter space of PDEs. We first show that the evolution operator learning can be formulated as a quadratic programming (QP) problem, then unroll the adaptive primal-dual hybrid gradient (APDHG) algorithm as the QP-solver into the neural operator architecture. This allows us to improve efficiency while retaining theoretical guarantees of the constrained optimization. Empirical results on a variety of high-dim PDEs show that HEAP outperforms the state-of-the-art neural operator model.
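
For orientation, below is a minimal sketch of a plain primal-dual hybrid gradient (PDHG) iteration for a generic QP of the form min_x 0.5 x'Qx + c'x subject to Ax <= b, the kind of problem the abstract says evolution operator learning reduces to. Everything here (the function name pdhg_qp, the fixed step sizes tau and sigma, the inequality-constrained form) is an illustrative assumption, not the authors' implementation; the paper's APDHG additionally adapts the step sizes and unrolls the iterations into network layers, which this sketch does not do.

    import numpy as np

    def pdhg_qp(Q, c, A, b, n_iter=500, tau=0.1, sigma=0.1):
        # Minimal PDHG (Chambolle-Pock) sketch for  min 0.5 x'Qx + c'x  s.t.  Ax <= b.
        # Illustrative only: fixed step sizes, no adaptivity, no learned components.
        # Plain PDHG converges when tau * sigma * ||A||_2^2 < 1.
        n, m = Q.shape[0], A.shape[0]
        x, y = np.zeros(n), np.zeros(m)      # primal and dual iterates
        M = Q + np.eye(n) / tau              # prox of f(x) = 0.5 x'Qx + c'x solves (Q + I/tau) x = v/tau - c
        for _ in range(n_iter):
            x_prev = x
            v = x - tau * A.T @ y            # primal step on the coupling term
            x = np.linalg.solve(M, v / tau - c)
            x_bar = 2.0 * x - x_prev         # extrapolation
            y = np.maximum(0.0, y + sigma * (A @ x_bar - b))  # dual ascent, projected onto y >= 0
        return x, y

    # Toy check: min 0.5||x||^2  s.t.  x1 + x2 <= -1  has solution x = (-0.5, -0.5).
    Q, c = np.eye(2), np.zeros(2)
    A, b = np.array([[1.0, 1.0]]), np.array([-1.0])
    x, y = pdhg_qp(Q, c, A, b)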

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-feng25k,
  title     = {{HEAP}: Hyper Extended A-{PDHG} Operator for Constrained High-dim {PDE}s},
  author    = {Feng, Mingquan and Liao, Weixin and Huang, Yixin and Fu, Yifan and Zheng, Qifu and Yan, Junchi},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {16825--16836},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/feng25k/feng25k.pdf},
  url       = {https://proceedings.mlr.press/v267/feng25k.html},
  abstract  = {Neural operators have emerged as a promising approach for solving high-dimensional partial differential equations (PDEs). However, existing neural operators often have difficulty in dealing with constrained PDEs, where the solution must satisfy additional equality or inequality constraints beyond the governing equations. To close this gap, we propose a novel neural operator, Hyper Extended Adaptive PDHG (HEAP) for constrained high-dim PDEs, where the learned operator evolves in the parameter space of PDEs. We first show that the evolution operator learning can be formulated as a quadratic programming (QP) problem, then unroll the adaptive primal-dual hybrid gradient (APDHG) algorithm as the QP-solver into the neural operator architecture. This allows us to improve efficiency while retaining theoretical guarantees of the constrained optimization. Empirical results on a variety of high-dim PDEs show that HEAP outperforms the state-of-the-art neural operator model.}
}
Endnote
%0 Conference Paper
%T HEAP: Hyper Extended A-PDHG Operator for Constrained High-dim PDEs
%A Mingquan Feng
%A Weixin Liao
%A Yixin Huang
%A Yifan Fu
%A Qifu Zheng
%A Junchi Yan
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-feng25k
%I PMLR
%P 16825--16836
%U https://proceedings.mlr.press/v267/feng25k.html
%V 267
%X Neural operators have emerged as a promising approach for solving high-dimensional partial differential equations (PDEs). However, existing neural operators often have difficulty in dealing with constrained PDEs, where the solution must satisfy additional equality or inequality constraints beyond the governing equations. To close this gap, we propose a novel neural operator, Hyper Extended Adaptive PDHG (HEAP) for constrained high-dim PDEs, where the learned operator evolves in the parameter space of PDEs. We first show that the evolution operator learning can be formulated as a quadratic programming (QP) problem, then unroll the adaptive primal-dual hybrid gradient (APDHG) algorithm as the QP-solver into the neural operator architecture. This allows us to improve efficiency while retaining theoretical guarantees of the constrained optimization. Empirical results on a variety of high-dim PDEs show that HEAP outperforms the state-of-the-art neural operator model.
APA
Feng, M., Liao, W., Huang, Y., Fu, Y., Zheng, Q. & Yan, J. (2025). HEAP: Hyper Extended A-PDHG Operator for Constrained High-dim PDEs. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:16825-16836. Available from https://proceedings.mlr.press/v267/feng25k.html.

Related Material

Download PDF: https://raw.githubusercontent.com/mlresearch/v267/main/assets/feng25k/feng25k.pdf