Free-form Flows: Make Any Architecture a Normalizing Flow

Felix Draxler, Peter Sorrenson, Lea Zimmermann, Armand Rousselot, Ullrich Köthe
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:2197-2205, 2024.

Abstract

Normalizing Flows are generative models that directly maximize the likelihood. Previously, the design of normalizing flows was largely constrained by the need for analytical invertibility. We overcome this constraint with a training procedure that uses an efficient estimator for the gradient of the change of variables formula. This enables any dimension-preserving neural network to serve as a generative model through maximum likelihood training. Our approach allows placing the emphasis on tailoring inductive biases precisely to the task at hand. Specifically, we achieve excellent results on molecule generation benchmarks utilizing E(n)-equivariant networks, at greatly improved sampling speed. Moreover, our method is competitive on an inverse problem benchmark while employing off-the-shelf ResNet architectures. We publish our code at https://github.com/vislearn/FFF.
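
To make the abstract's key idea concrete, the sketch below shows one plausible way the per-sample training loss could be assembled in JAX. An encoder f is trained alongside a learned approximate inverse g: a single Hutchinson probe combines a Jacobian-vector product through f with a gradient-stopped vector-Jacobian product through g to estimate the gradient of log|det J_f(x)|, and a reconstruction penalty keeps g close to f^{-1}. All names, the single-probe choice, and the weight beta are illustrative assumptions on our part, not the authors' reference implementation (their code lives at the repository above).

```python
# Hypothetical sketch of a free-form-flow-style loss, assuming an encoder
# `f` (data -> latent) and a decoder `g` (latent -> data) of equal
# dimension. Not the authors' reference implementation.
import jax
import jax.numpy as jnp

def fff_loss(f, g, x, key, beta=10.0):
    """Maximum-likelihood surrogate loss for a single data point x."""
    # Hutchinson probe vector for the trace estimator.
    v = jax.random.normal(key, x.shape)

    # Forward pass plus Jacobian-vector product: z = f(x), u = J_f(x) v.
    z, u = jax.jvp(f, (x,), (v,))

    # Vector-Jacobian product through the decoder, treated as a constant:
    # w = v^T J_g(z). No gradients may flow through this factor.
    _, vjp_g = jax.vjp(g, jax.lax.stop_gradient(z))
    (w,) = vjp_g(v)
    w = jax.lax.stop_gradient(w)

    # Surrogate whose gradient matches that of log|det J_f(x)| in
    # expectation over v: since J_g(z) approximates J_f(x)^{-1},
    # grad of v^T J_g(z) J_f(x) v estimates trace(J_f^{-1} grad J_f).
    # (Its *value* is not the log-determinant itself.)
    logdet_surrogate = jnp.dot(w, u)

    # Standard-normal latent prior: -log p(z) up to an additive constant.
    nll_prior = 0.5 * jnp.sum(z ** 2)

    # Reconstruction term keeps g close to the inverse of f.
    recon = jnp.sum((g(z) - x) ** 2)

    return nll_prior - logdet_surrogate + beta * recon
```

In practice one would vmap this over a batch, average, and differentiate with jax.grad with respect to the parameters of both networks; the cost per step is one JVP and one VJP rather than a full Jacobian determinant.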

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-draxler24a,
  title     = {Free-form Flows: Make Any Architecture a Normalizing Flow},
  author    = {Draxler, Felix and Sorrenson, Peter and Zimmermann, Lea and Rousselot, Armand and K\"{o}the, Ullrich},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {2197--2205},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/draxler24a/draxler24a.pdf},
  url       = {https://proceedings.mlr.press/v238/draxler24a.html},
  abstract  = {Normalizing Flows are generative models that directly maximize the likelihood. Previously, the design of normalizing flows was largely constrained by the need for analytical invertibility. We overcome this constraint by a training procedure that uses an efficient estimator for the gradient of the change of variables formula. This enables any dimension-preserving neural network to serve as a generative model through maximum likelihood training. Our approach allows placing the emphasis on tailoring inductive biases precisely to the task at hand. Specifically, we achieve excellent results in molecule generation benchmarks utilizing E(n)-equivariant networks at greatly improved sampling speed. Moreover, our method is competitive in an inverse problem benchmark, while employing off-the-shelf ResNet architectures. We publish our code at https://github.com/vislearn/FFF.}
}
Endnote
%0 Conference Paper
%T Free-form Flows: Make Any Architecture a Normalizing Flow
%A Felix Draxler
%A Peter Sorrenson
%A Lea Zimmermann
%A Armand Rousselot
%A Ullrich Köthe
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-draxler24a
%I PMLR
%P 2197--2205
%U https://proceedings.mlr.press/v238/draxler24a.html
%V 238
%X Normalizing Flows are generative models that directly maximize the likelihood. Previously, the design of normalizing flows was largely constrained by the need for analytical invertibility. We overcome this constraint by a training procedure that uses an efficient estimator for the gradient of the change of variables formula. This enables any dimension-preserving neural network to serve as a generative model through maximum likelihood training. Our approach allows placing the emphasis on tailoring inductive biases precisely to the task at hand. Specifically, we achieve excellent results in molecule generation benchmarks utilizing E(n)-equivariant networks at greatly improved sampling speed. Moreover, our method is competitive in an inverse problem benchmark, while employing off-the-shelf ResNet architectures. We publish our code at https://github.com/vislearn/FFF.
APA
Draxler, F., Sorrenson, P., Zimmermann, L., Rousselot, A. & Köthe, U. (2024). Free-form Flows: Make Any Architecture a Normalizing Flow. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:2197-2205. Available from https://proceedings.mlr.press/v238/draxler24a.html.
