Improving Flow Matching by Aligning Flow Divergence

Yuhao Huang, Taos Transue, Shih-Hsin Wang, William M Feldman, Hong Zhang, Bao Wang
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:25813-25834, 2025.

Abstract

Conditional flow matching (CFM) stands out as an efficient, simulation-free approach for training flow-based generative models, achieving remarkable performance for data generation. However, CFM is insufficient to ensure accuracy in learning probability paths. In this paper, we introduce a new partial differential equation characterization for the error between the learned and exact probability paths, along with its solution. We show that the total variation gap between the two probability paths is bounded above by a combination of the CFM loss and an associated divergence loss. This theoretical insight leads to the design of a new objective function that simultaneously matches the flow and its divergence. Our new approach improves the performance of the flow-based generative model by a noticeable margin without sacrificing generation efficiency. We showcase the advantages of this enhanced training approach over CFM on several important benchmark tasks, including generative modeling for dynamical systems, DNA sequences, and videos. Code is available at https://github.com/Utah-Math-Data-Science/Flow_Div_Matching.
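The core idea, matching both the learned vector field and its divergence to those of the conditional target, can be summarized in a short training-loss sketch. The PyTorch snippet below is a minimal illustration, not the authors' implementation: it assumes the standard CFM regression target u_t(x|x1) is available as u_target, that the divergence of the conditional target field is available as div_u_target, and it estimates the divergence of the learned field with Hutchinson's trace estimator (the paper may compute divergences differently). The names flow_div_matching_loss, model, and the weight lam are hypothetical.

import torch

def hutchinson_divergence(v, x, n_probes=1):
    # Unbiased estimate of div v(x) = tr(dv/dx): for Rademacher probes eps,
    # E_eps[eps^T (dv/dx) eps] equals the trace of the Jacobian.
    div = torch.zeros(x.shape[0], device=x.device, dtype=x.dtype)
    for _ in range(n_probes):
        eps = torch.randint_like(x, low=0, high=2) * 2 - 1  # Rademacher +-1
        # vector-Jacobian product eps^T (dv/dx), then contract with eps again
        (vjp,) = torch.autograd.grad(
            v, x, grad_outputs=eps, create_graph=True, retain_graph=True
        )
        div = div + (vjp * eps).flatten(1).sum(dim=1)
    return div / n_probes

def flow_div_matching_loss(model, x_t, t, u_target, div_u_target, lam=0.1):
    # Hypothetical combined objective: the usual CFM regression on the
    # vector field, plus a penalty aligning the divergence of the learned
    # field with the divergence of the conditional target field.
    x_t = x_t.clone().requires_grad_(True)
    v = model(x_t, t)
    cfm = ((v - u_target) ** 2).flatten(1).sum(dim=1).mean()
    div_v = hutchinson_divergence(v, x_t)
    div_loss = ((div_v - div_u_target) ** 2).mean()
    return cfm + lam * div_loss

For common conditional paths the target divergence is cheap: e.g., for the rectified linear path u_t(x|x1) = (x1 - x)/(1 - t) in d dimensions, div u_t(x|x1) = -d/(1 - t) in closed form, so no extra network evaluation is needed for div_u_target.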

Cite this Paper

BibTeX
@InProceedings{pmlr-v267-huang25ag,
  title     = {Improving Flow Matching by Aligning Flow Divergence},
  author    = {Huang, Yuhao and Transue, Taos and Wang, Shih-Hsin and Feldman, William M and Zhang, Hong and Wang, Bao},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {25813--25834},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/huang25ag/huang25ag.pdf},
  url       = {https://proceedings.mlr.press/v267/huang25ag.html}
}
Endnote
%0 Conference Paper
%T Improving Flow Matching by Aligning Flow Divergence
%A Yuhao Huang
%A Taos Transue
%A Shih-Hsin Wang
%A William M Feldman
%A Hong Zhang
%A Bao Wang
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-huang25ag
%I PMLR
%P 25813--25834
%U https://proceedings.mlr.press/v267/huang25ag.html
%V 267
APA
Huang, Y., Transue, T., Wang, S.-H., Feldman, W.M., Zhang, H. & Wang, B. (2025). Improving Flow Matching by Aligning Flow Divergence. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:25813-25834. Available from https://proceedings.mlr.press/v267/huang25ag.html.
