PDHG-Unrolled Learning-to-Optimize Method for Large-Scale Linear Programming

Bingheng Li, Linxin Yang, Yupeng Chen, Senmiao Wang, Haitao Mao, Qian Chen, Yao Ma, Akang Wang, Tian Ding, Jiliang Tang, Ruoyu Sun
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:29164-29180, 2024.

Abstract

Solving large-scale linear programming (LP) problems is an important task in areas such as communication networks, power systems, finance, and logistics. Recently, two distinct approaches have emerged to expedite LP solving: (i) first-order methods (FOMs) and (ii) learning to optimize (L2O). In this work, we propose an FOM-unrolled neural network (NN) called PDHG-Net, together with a two-stage L2O method, to solve large-scale LP problems. The PDHG-Net architecture is designed by unrolling the recently emerged PDHG method into a neural network, combined with channel-expansion techniques borrowed from graph neural networks. We prove that PDHG-Net can recover the PDHG algorithm and can therefore approximate optimal solutions of LP instances with a polynomial number of neurons. Our two-stage inference approach first uses PDHG-Net to generate an approximate solution and then applies the PDHG algorithm to refine it. Experiments show that our approach can significantly accelerate LP solving, achieving up to a 3$\times$ speedup over FOMs on large-scale LP problems.
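
For readers unfamiliar with PDHG, the following is a minimal sketch of the two-stage pipeline the abstract describes, assuming a standard-form LP (min c^T x subject to Ax = b, x >= 0) and the textbook Chambolle-Pock PDHG update; the restarted, preconditioned variant used in practice (e.g., PDLP) differs in detail. The name warm_start_net is a hypothetical stand-in for a trained PDHG-Net, not the authors' implementation.

    import numpy as np

    def pdhg_lp(c, A, b, x0, y0, tau, sigma, iters):
        """Vanilla PDHG for min c^T x s.t. A x = b, x >= 0."""
        x, y = x0.copy(), y0.copy()
        for _ in range(iters):
            # Primal step: gradient step on the Lagrangian, projected onto x >= 0.
            x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)
            # Dual step uses the extrapolated primal point 2*x_new - x.
            y = y + sigma * (b - A @ (2.0 * x_new - x))
            x = x_new
        return x, y

    def two_stage_solve(c, A, b, warm_start_net, refine_iters=1000):
        """Stage 1: the NN predicts an approximate primal-dual pair.
           Stage 2: PDHG polishes that prediction to higher accuracy."""
        x0, y0 = warm_start_net(c, A, b)   # hypothetical PDHG-Net inference
        # tau = sigma = 1/||A||_2 satisfies the PDHG step-size condition below.
        eta = 1.0 / np.linalg.norm(A, 2)
        return pdhg_lp(c, A, b, x0, y0, tau=eta, sigma=eta, iters=refine_iters)

With tau = sigma = 1/||A||_2, the step sizes satisfy the standard PDHG convergence condition tau * sigma * ||A||_2^2 <= 1; production solvers add restarts, preconditioning, and adaptive step sizes that this sketch omits.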

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-li24ce,
  title     = {{PDHG}-Unrolled Learning-to-Optimize Method for Large-Scale Linear Programming},
  author    = {Li, Bingheng and Yang, Linxin and Chen, Yupeng and Wang, Senmiao and Mao, Haitao and Chen, Qian and Ma, Yao and Wang, Akang and Ding, Tian and Tang, Jiliang and Sun, Ruoyu},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {29164--29180},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/li24ce/li24ce.pdf},
  url       = {https://proceedings.mlr.press/v235/li24ce.html}
}
Endnote
%0 Conference Paper
%T PDHG-Unrolled Learning-to-Optimize Method for Large-Scale Linear Programming
%A Bingheng Li
%A Linxin Yang
%A Yupeng Chen
%A Senmiao Wang
%A Haitao Mao
%A Qian Chen
%A Yao Ma
%A Akang Wang
%A Tian Ding
%A Jiliang Tang
%A Ruoyu Sun
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-li24ce
%I PMLR
%P 29164--29180
%U https://proceedings.mlr.press/v235/li24ce.html
%V 235
APA
Li, B., Yang, L., Chen, Y., Wang, S., Mao, H., Chen, Q., Ma, Y., Wang, A., Ding, T., Tang, J. & Sun, R. (2024). PDHG-Unrolled Learning-to-Optimize Method for Large-Scale Linear Programming. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:29164-29180. Available from https://proceedings.mlr.press/v235/li24ce.html.