MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts

Jianan Zhou, Zhiguang Cao, Yaoxin Wu, Wen Song, Yining Ma, Jie Zhang, Xu Chi
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:61804-61824, 2024.

Abstract

Learning to solve vehicle routing problems (VRPs) has garnered much attention. However, most neural solvers are structured and trained independently for a specific problem, making them less generic and practical. In this paper, we aim to develop a unified neural solver that can cope with a range of VRP variants simultaneously. Specifically, we propose a multi-task vehicle routing solver with mixture-of-experts (MVMoE), which greatly enhances model capacity without a proportional increase in computation. We further develop a hierarchical gating mechanism for the MVMoE, delivering a good trade-off between empirical performance and computational complexity. Experimentally, our method significantly improves zero-shot generalization on 10 unseen VRP variants, and achieves decent results in the few-shot setting and on real-world benchmark instances. We further conduct extensive studies on the effect of MoE configurations in solving VRPs, and observe the superiority of hierarchical gating on out-of-distribution data. The source code is available at: https://github.com/RoyalSkye/Routing-MVMoE.
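As a rough illustration of the mechanism the abstract refers to, below is a minimal PyTorch sketch of a sparse mixture-of-experts layer with top-k gating. It is not the authors' implementation (see the linked repository for that); the class name, dimensions, and expert count are assumptions chosen for clarity.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Illustrative sparse MoE feed-forward layer (assumed sizes, not the paper's code)."""
    def __init__(self, d_model=128, d_hidden=512, num_experts=4, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        ])
        self.gate = nn.Linear(d_model, num_experts)  # input-dependent routing scores
        self.top_k = top_k

    def forward(self, x):                            # x: (tokens, d_model)
        scores = self.gate(x)                        # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)         # renormalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):                  # only top_k experts run per token
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

layer = SparseMoE()
y = layer(torch.randn(10, 128))  # each of 10 embeddings is routed to 2 of 4 experts

Because only top_k of the num_experts networks are evaluated per input, parameter count grows with the number of experts while per-token compute stays roughly constant, which is the capacity-without-proportional-computation trade-off the abstract describes. The paper's hierarchical gating refines this single-level routing step to further reduce gating cost; this sketch shows only the base mechanism.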

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-zhou24c,
  title     = {{MVM}o{E}: Multi-Task Vehicle Routing Solver with Mixture-of-Experts},
  author    = {Zhou, Jianan and Cao, Zhiguang and Wu, Yaoxin and Song, Wen and Ma, Yining and Zhang, Jie and Chi, Xu},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {61804--61824},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/zhou24c/zhou24c.pdf},
  url       = {https://proceedings.mlr.press/v235/zhou24c.html}
}
Endnote
%0 Conference Paper
%T MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts
%A Jianan Zhou
%A Zhiguang Cao
%A Yaoxin Wu
%A Wen Song
%A Yining Ma
%A Jie Zhang
%A Xu Chi
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-zhou24c
%I PMLR
%P 61804--61824
%U https://proceedings.mlr.press/v235/zhou24c.html
%V 235
APA
Zhou, J., Cao, Z., Wu, Y., Song, W., Ma, Y., Zhang, J. & Chi, X. (2024). MVMoE: Multi-Task Vehicle Routing Solver with Mixture-of-Experts. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:61804-61824. Available from https://proceedings.mlr.press/v235/zhou24c.html.
