Controlled Generation with Equivariant Variational Flow Matching

Floor Eijkelboom, Heiko Zimmermann, Sharvaree Vadgama, Erik J Bekkers, Max Welling, Christian A. Naesseth, Jan-Willem Van De Meent
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:15066-15078, 2025.

Abstract

We derive a controlled generation objective within the framework of Variational Flow Matching (VFM), which casts flow matching as a variational inference problem. We demonstrate that controlled generation can be implemented in two ways: (1) through end-to-end training of conditional generative models, or (2) as a Bayesian inference problem, enabling post hoc control of unconditional models without retraining. Furthermore, we establish the conditions required for equivariant generation and provide an equivariant formulation of VFM tailored for molecular generation, ensuring invariance to rotations, translations, and permutations. We evaluate our approach on both uncontrolled and controlled molecular generation, achieving state-of-the-art performance on uncontrolled generation and outperforming state-of-the-art models in controlled generation, both with end-to-end training and in the Bayesian inference setting. This work strengthens the connection between flow-based generative modeling and Bayesian inference, offering a scalable and principled framework for constraint-driven and symmetry-aware generation.
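As a brief, hedged sketch of the setup the abstract refers to (the notation below follows the earlier VFM paper and is an assumption here, not a reproduction of this paper's derivation): VFM approximates the posterior probability path over endpoints x_1 with a variational distribution, and the post hoc "Bayesian inference" route to controlled generation reweights that posterior by a likelihood over the control signal y, assuming y depends on x_t only through x_1.

% Assumed VFM notation: noisy state x_t ~ p_t, data endpoint x_1,
% conditional vector field u_t(x | x_1), variational posterior q^theta_t,
% control signal y. A sketch, not the paper's exact objective.
\begin{align*}
  % VFM objective: match the endpoint posterior rather than the vector field
  \mathcal{L}_{\mathrm{VFM}}(\theta)
    &= \mathbb{E}_{t,\, x_1,\, x_t}\!\left[-\log q^{\theta}_t(x_1 \mid x_t)\right], \\
  % the generative vector field is recovered as a posterior expectation
  v^{\theta}_t(x)
    &= \mathbb{E}_{q^{\theta}_t(x_1 \mid x)}\!\left[u_t(x \mid x_1)\right], \\
  % post hoc control: Bayes rule on the endpoint posterior,
  % with the unconditional variational posterior acting as the prior
  p_t(x_1 \mid x_t, y)
    &\propto p(y \mid x_1)\, p_t(x_1 \mid x_t)
    \;\approx\; p(y \mid x_1)\, q^{\theta}_t(x_1 \mid x_t).
\end{align*}

In the end-to-end variant the variational posterior is instead trained directly as q^theta_t(x_1 | x_t, y); the equivariant formulation additionally constrains q^theta_t so that the induced flow respects rotations, translations, and permutations.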

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-eijkelboom25a,
  title     = {Controlled Generation with Equivariant Variational Flow Matching},
  author    = {Eijkelboom, Floor and Zimmermann, Heiko and Vadgama, Sharvaree and Bekkers, Erik J and Welling, Max and Naesseth, Christian A. and Van De Meent, Jan-Willem},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {15066--15078},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/eijkelboom25a/eijkelboom25a.pdf},
  url       = {https://proceedings.mlr.press/v267/eijkelboom25a.html},
  abstract  = {We derive a controlled generation objective within the framework of Variational Flow Matching (VFM), which casts flow matching as a variational inference problem. We demonstrate that controlled generation can be implemented two ways: (1) by way of end-to-end training of conditional generative models, or (2) as a Bayesian inference problem, enabling post hoc control of unconditional models without retraining. Furthermore, we establish the conditions required for equivariant generation and provide an equivariant formulation of VFM tailored for molecular generation, ensuring invariance to rotations, translations, and permutations. We evaluate our approach on both uncontrolled and controlled molecular generation, achieving state-of-the-art performance on uncontrolled generation and outperforming state-of-the-art models in controlled generation, both with end-to-end training and in the Bayesian inference setting. This work strengthens the connection between flow-based generative modeling and Bayesian inference, offering a scalable and principled framework for constraint-driven and symmetry-aware generation.}
}
Endnote
%0 Conference Paper
%T Controlled Generation with Equivariant Variational Flow Matching
%A Floor Eijkelboom
%A Heiko Zimmermann
%A Sharvaree Vadgama
%A Erik J Bekkers
%A Max Welling
%A Christian A. Naesseth
%A Jan-Willem Van De Meent
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-eijkelboom25a
%I PMLR
%P 15066--15078
%U https://proceedings.mlr.press/v267/eijkelboom25a.html
%V 267
%X We derive a controlled generation objective within the framework of Variational Flow Matching (VFM), which casts flow matching as a variational inference problem. We demonstrate that controlled generation can be implemented two ways: (1) by way of end-to-end training of conditional generative models, or (2) as a Bayesian inference problem, enabling post hoc control of unconditional models without retraining. Furthermore, we establish the conditions required for equivariant generation and provide an equivariant formulation of VFM tailored for molecular generation, ensuring invariance to rotations, translations, and permutations. We evaluate our approach on both uncontrolled and controlled molecular generation, achieving state-of-the-art performance on uncontrolled generation and outperforming state-of-the-art models in controlled generation, both with end-to-end training and in the Bayesian inference setting. This work strengthens the connection between flow-based generative modeling and Bayesian inference, offering a scalable and principled framework for constraint-driven and symmetry-aware generation.
APA
Eijkelboom, F., Zimmermann, H., Vadgama, S., Bekkers, E.J., Welling, M., Naesseth, C.A. & Van De Meent, J.-W. (2025). Controlled Generation with Equivariant Variational Flow Matching. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:15066-15078. Available from https://proceedings.mlr.press/v267/eijkelboom25a.html.
