A theory of continuous generative flow networks

Salem Lahlou, Tristan Deleu, Pablo Lemos, Dinghuai Zhang, Alexandra Volokhova, Alex Hernández-García, Lena Nehale Ezzine, Yoshua Bengio, Nikolay Malkin
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:18269-18300, 2023.

Abstract

Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects. A key limitation of GFlowNets until this time has been that they are restricted to discrete spaces. We present a theory for generalized GFlowNets, which encompasses both existing discrete GFlowNets and ones with continuous or hybrid state spaces, and perform experiments with two goals in mind. First, we illustrate critical points of the theory and the importance of various assumptions. Second, we empirically demonstrate how observations about discrete GFlowNets transfer to the continuous case and show strong results compared to non-GFlowNet baselines on several previously studied tasks. This work greatly widens the perspectives for the application of GFlowNets in probabilistic inference and various modeling settings.

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-lahlou23a,
  title     = {A theory of continuous generative flow networks},
  author    = {Lahlou, Salem and Deleu, Tristan and Lemos, Pablo and Zhang, Dinghuai and Volokhova, Alexandra and Hern\'{a}ndez-Garc\'{\i}a, Alex and Ezzine, Lena Nehale and Bengio, Yoshua and Malkin, Nikolay},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {18269--18300},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/lahlou23a/lahlou23a.pdf},
  url       = {https://proceedings.mlr.press/v202/lahlou23a.html},
  abstract  = {Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects. A key limitation of GFlowNets until this time has been that they are restricted to discrete spaces. We present a theory for generalized GFlowNets, which encompasses both existing discrete GFlowNets and ones with continuous or hybrid state spaces, and perform experiments with two goals in mind. First, we illustrate critical points of the theory and the importance of various assumptions. Second, we empirically demonstrate how observations about discrete GFlowNets transfer to the continuous case and show strong results compared to non-GFlowNet baselines on several previously studied tasks. This work greatly widens the perspectives for the application of GFlowNets in probabilistic inference and various modeling settings.}
}
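For readers unfamiliar with BibTeX, a minimal LaTeX sketch of how this entry might be used; the bibliography file name `refs.bib` is illustrative, and only the citation key `pmlr-v202-lahlou23a` comes from the entry above:

```latex
% Assumes the @InProceedings entry above is saved in refs.bib
\documentclass{article}
\begin{document}
GFlowNets were extended to continuous and hybrid state spaces
by \cite{pmlr-v202-lahlou23a}.
\bibliographystyle{plain}
\bibliography{refs}
\end{document}
```

With `biblatex` instead of classic BibTeX, the same key would be loaded via `\addbibresource{refs.bib}` and printed with `\printbibliography`.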
Endnote
%0 Conference Paper
%T A theory of continuous generative flow networks
%A Salem Lahlou
%A Tristan Deleu
%A Pablo Lemos
%A Dinghuai Zhang
%A Alexandra Volokhova
%A Alex Hernández-García
%A Lena Nehale Ezzine
%A Yoshua Bengio
%A Nikolay Malkin
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-lahlou23a
%I PMLR
%P 18269--18300
%U https://proceedings.mlr.press/v202/lahlou23a.html
%V 202
%X Generative flow networks (GFlowNets) are amortized variational inference algorithms that are trained to sample from unnormalized target distributions over compositional objects. A key limitation of GFlowNets until this time has been that they are restricted to discrete spaces. We present a theory for generalized GFlowNets, which encompasses both existing discrete GFlowNets and ones with continuous or hybrid state spaces, and perform experiments with two goals in mind. First, we illustrate critical points of the theory and the importance of various assumptions. Second, we empirically demonstrate how observations about discrete GFlowNets transfer to the continuous case and show strong results compared to non-GFlowNet baselines on several previously studied tasks. This work greatly widens the perspectives for the application of GFlowNets in probabilistic inference and various modeling settings.
APA
Lahlou, S., Deleu, T., Lemos, P., Zhang, D., Volokhova, A., Hernández-García, A., Ezzine, L. N., Bengio, Y., & Malkin, N. (2023). A theory of continuous generative flow networks. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:18269-18300. Available from https://proceedings.mlr.press/v202/lahlou23a.html.