Symmetry-Aware GFlowNets

Hohyun Kim, Seunggeun Lee, Min-Hwan Oh
Proceedings of the 42nd International Conference on Machine Learning, PMLR 267:30334-30362, 2025.

Abstract

Generative Flow Networks (GFlowNets) offer a powerful framework for sampling graphs in proportion to their rewards. However, existing approaches suffer from systematic biases due to inaccuracies in state transition probability computations. These biases, rooted in the inherent symmetries of graphs, impact both atom-based and fragment-based generation schemes. To address this challenge, we introduce Symmetry-Aware GFlowNets (SA-GFN), a method that incorporates symmetry corrections into the learning process through reward scaling. By integrating bias correction directly into the reward structure, SA-GFN eliminates the need for explicit state transition computations. Empirical results show that SA-GFN enables unbiased sampling while enhancing diversity and consistently generating high-reward graphs that closely match the target distribution.
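The core idea described above — correcting symmetry-induced sampling bias by scaling the reward rather than recomputing state transition probabilities — can be sketched minimally. The snippet below is an illustrative reconstruction, not the authors' implementation: it brute-forces the automorphism count |Aut(G)| of a small undirected graph and uses it as a reward scaling factor (here dividing by |Aut(G)|; the exact direction and form of the correction in SA-GFN is specified in the paper). All function names are hypothetical.

```python
from itertools import permutations


def count_automorphisms(n, edges):
    """Brute-force |Aut(G)| for a small undirected graph on nodes 0..n-1.

    Exponential in n; fine for toy graphs, illustrative only.
    """
    edge_set = {frozenset(e) for e in edges}
    return sum(
        1
        for p in permutations(range(n))
        if {frozenset((p[u], p[v])) for (u, v) in edges} == edge_set
    )


def symmetry_scaled_reward(reward, n, edges):
    """Scale a graph's reward by its automorphism count.

    Graphs with more symmetries are reachable through proportionally
    more equivalent construction trajectories, so an automorphism-based
    factor can absorb that multiplicity into the reward itself.
    """
    return reward / count_automorphisms(n, edges)
```

For example, a triangle on 3 nodes has 6 automorphisms while a 3-node path has only 2, so under this sketch the triangle's reward is scaled down three times as much.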

Cite this Paper


BibTeX
@InProceedings{pmlr-v267-kim25s,
  title     = {Symmetry-Aware {GF}low{N}ets},
  author    = {Kim, Hohyun and Lee, Seunggeun and Oh, Min-Hwan},
  booktitle = {Proceedings of the 42nd International Conference on Machine Learning},
  pages     = {30334--30362},
  year      = {2025},
  editor    = {Singh, Aarti and Fazel, Maryam and Hsu, Daniel and Lacoste-Julien, Simon and Berkenkamp, Felix and Maharaj, Tegan and Wagstaff, Kiri and Zhu, Jerry},
  volume    = {267},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--19 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v267/main/assets/kim25s/kim25s.pdf},
  url       = {https://proceedings.mlr.press/v267/kim25s.html},
  abstract  = {Generative Flow Networks (GFlowNets) offer a powerful framework for sampling graphs in proportion to their rewards. However, existing approaches suffer from systematic biases due to inaccuracies in state transition probability computations. These biases, rooted in the inherent symmetries of graphs, impact both atom-based and fragment-based generation schemes. To address this challenge, we introduce Symmetry-Aware GFlowNets (SA-GFN), a method that incorporates symmetry corrections into the learning process through reward scaling. By integrating bias correction directly into the reward structure, SA-GFN eliminates the need for explicit state transition computations. Empirical results show that SA-GFN enables unbiased sampling while enhancing diversity and consistently generating high-reward graphs that closely match the target distribution.}
}
Endnote
%0 Conference Paper
%T Symmetry-Aware GFlowNets
%A Hohyun Kim
%A Seunggeun Lee
%A Min-Hwan Oh
%B Proceedings of the 42nd International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Aarti Singh
%E Maryam Fazel
%E Daniel Hsu
%E Simon Lacoste-Julien
%E Felix Berkenkamp
%E Tegan Maharaj
%E Kiri Wagstaff
%E Jerry Zhu
%F pmlr-v267-kim25s
%I PMLR
%P 30334--30362
%U https://proceedings.mlr.press/v267/kim25s.html
%V 267
%X Generative Flow Networks (GFlowNets) offer a powerful framework for sampling graphs in proportion to their rewards. However, existing approaches suffer from systematic biases due to inaccuracies in state transition probability computations. These biases, rooted in the inherent symmetries of graphs, impact both atom-based and fragment-based generation schemes. To address this challenge, we introduce Symmetry-Aware GFlowNets (SA-GFN), a method that incorporates symmetry corrections into the learning process through reward scaling. By integrating bias correction directly into the reward structure, SA-GFN eliminates the need for explicit state transition computations. Empirical results show that SA-GFN enables unbiased sampling while enhancing diversity and consistently generating high-reward graphs that closely match the target distribution.
APA
Kim, H., Lee, S. &amp; Oh, M. (2025). Symmetry-Aware GFlowNets. Proceedings of the 42nd International Conference on Machine Learning, in Proceedings of Machine Learning Research 267:30334-30362. Available from https://proceedings.mlr.press/v267/kim25s.html.