Multi-objective Optimization via Wasserstein-Fisher-Rao Gradient Flow

Yinuo Ren, Tesi Xiao, Tanmay Gangwani, Anshuka Rangi, Holakou Rahmanian, Lexing Ying, Subhajit Sanyal
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:3862-3870, 2024.

Abstract

Multi-objective optimization (MOO) aims to optimize multiple, possibly conflicting objectives with widespread applications. We introduce a novel interacting particle method for MOO inspired by molecular dynamics simulations. Our approach combines overdamped Langevin and birth-death dynamics, incorporating a “dominance potential” to steer particles toward global Pareto optimality. In contrast to previous methods, our method is able to relocate dominated particles, making it particularly adept at managing Pareto fronts of complicated geometries. Our method is also theoretically grounded as a Wasserstein-Fisher-Rao gradient flow with convergence guarantees. Extensive experiments confirm that our approach outperforms state-of-the-art methods on challenging synthetic and real-world datasets.
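To make the particle dynamics described in the abstract concrete, below is a minimal illustrative sketch in Python. It is not the authors' implementation: particles alternate overdamped Langevin steps on a randomly weighted scalarization of the objectives with an occasional birth-death move that relocates the most-dominated particle near a least-dominated one. The toy objectives, the dominance count used as a stand-in for the paper's dominance potential, and all hyperparameters are assumptions for illustration only.

# Minimal illustrative sketch (not the paper's exact algorithm): interacting
# particles alternating overdamped Langevin steps with a birth-death move
# driven by a simple dominance count. All names, the linear scalarization,
# and the hyperparameters below are assumptions for illustration.
import numpy as np

def objectives(x):
    # Toy bi-objective problem (assumed for the demo): minimize both components.
    return np.stack([np.sum((x - 1.0) ** 2, axis=-1),
                     np.sum((x + 1.0) ** 2, axis=-1)], axis=-1)

def grad_scalarized(x, weights, eps=1e-4):
    # Finite-difference gradient of a linear scalarization of the objectives.
    g = np.zeros_like(x)
    base = objectives(x) @ weights
    for d in range(x.shape[-1]):
        xp = x.copy()
        xp[:, d] += eps
        g[:, d] = (objectives(xp) @ weights - base) / eps
    return g

def dominance_count(F):
    # Number of particles that Pareto-dominate each particle; used here as a
    # crude stand-in for a dominance potential (the paper's choice may differ).
    n = F.shape[0]
    counts = np.zeros(n)
    for i in range(n):
        better_eq = np.all(F <= F[i], axis=-1)
        strictly = np.any(F < F[i], axis=-1)
        counts[i] = np.sum(better_eq & strictly)
    return counts

def run(n_particles=64, dim=2, steps=500, dt=1e-2, temp=1e-3, bd_rate=0.1, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    X = rng.normal(size=(n_particles, dim))
    for _ in range(steps):
        # Overdamped Langevin step on a randomly weighted scalarization.
        w = rng.dirichlet(np.ones(2))
        X -= dt * grad_scalarized(X, w)
        X += np.sqrt(2 * temp * dt) * rng.normal(size=X.shape)
        # Birth-death move: occasionally replace the most-dominated particle
        # by a perturbed copy of a least-dominated one (mass relocation).
        if rng.random() < bd_rate:
            c = dominance_count(objectives(X))
            X[np.argmax(c)] = X[np.argmin(c)] + 1e-2 * rng.normal(size=dim)
    return X

if __name__ == "__main__":
    particles = run()
    print(objectives(particles)[:5])

Running the script prints objective values for a few particles after the dynamics. The qualitative point matches the abstract: the birth-death move lets particles that fall behind the front be relocated rather than remaining stuck, whereas a pure Langevin flow would only move them locally.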

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-ren24b,
  title     = {Multi-objective Optimization via {W}asserstein-{F}isher-{R}ao Gradient Flow},
  author    = {Ren, Yinuo and Xiao, Tesi and Gangwani, Tanmay and Rangi, Anshuka and Rahmanian, Holakou and Ying, Lexing and Sanyal, Subhajit},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {3862--3870},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/ren24b/ren24b.pdf},
  url       = {https://proceedings.mlr.press/v238/ren24b.html},
  abstract  = {Multi-objective optimization (MOO) aims to optimize multiple, possibly conflicting objectives with widespread applications. We introduce a novel interacting particle method for MOO inspired by molecular dynamics simulations. Our approach combines overdamped Langevin and birth-death dynamics, incorporating a “dominance potential” to steer particles toward global Pareto optimality. In contrast to previous methods, our method is able to relocate dominated particles, making it particularly adept at managing Pareto fronts of complicated geometries. Our method is also theoretically grounded as a Wasserstein-Fisher-Rao gradient flow with convergence guarantees. Extensive experiments confirm that our approach outperforms state-of-the-art methods on challenging synthetic and real-world datasets.}
}
Endnote
%0 Conference Paper
%T Multi-objective Optimization via Wasserstein-Fisher-Rao Gradient Flow
%A Yinuo Ren
%A Tesi Xiao
%A Tanmay Gangwani
%A Anshuka Rangi
%A Holakou Rahmanian
%A Lexing Ying
%A Subhajit Sanyal
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-ren24b
%I PMLR
%P 3862--3870
%U https://proceedings.mlr.press/v238/ren24b.html
%V 238
%X Multi-objective optimization (MOO) aims to optimize multiple, possibly conflicting objectives with widespread applications. We introduce a novel interacting particle method for MOO inspired by molecular dynamics simulations. Our approach combines overdamped Langevin and birth-death dynamics, incorporating a “dominance potential” to steer particles toward global Pareto optimality. In contrast to previous methods, our method is able to relocate dominated particles, making it particularly adept at managing Pareto fronts of complicated geometries. Our method is also theoretically grounded as a Wasserstein-Fisher-Rao gradient flow with convergence guarantees. Extensive experiments confirm that our approach outperforms state-of-the-art methods on challenging synthetic and real-world datasets.
APA
Ren, Y., Xiao, T., Gangwani, T., Rangi, A., Rahmanian, H., Ying, L. & Sanyal, S. (2024). Multi-objective Optimization via Wasserstein-Fisher-Rao Gradient Flow. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:3862-3870. Available from https://proceedings.mlr.press/v238/ren24b.html.
