Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows

Phillip Si, Zeyi Chen, Subham Sekhar Sahoo, Yair Schiff, Volodymyr Kuleshov
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:31732-31753, 2023.

Abstract

Training normalizing flow generative models can be challenging due to the need to calculate computationally expensive determinants of Jacobians. This paper studies the likelihood-free training of flows and proposes the energy objective, an alternative sample-based loss based on proper scoring rules. The energy objective is determinant-free and supports flexible model architectures that are not easily compatible with maximum likelihood training, including semi-autoregressive energy flows, a novel model family that interpolates between fully autoregressive and non-autoregressive models. Energy flows feature competitive sample quality, posterior inference, and generation speed relative to likelihood-based flows; this performance is decorrelated from the quality of log-likelihood estimates, which are generally very poor. Our findings question the use of maximum likelihood as an objective or a metric, and contribute to a scientific study of its role in generative modeling. Code is available at https://github.com/ps789/SAEF.
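The energy objective is built on proper scoring rules; a minimal sketch of a sample-based energy score, in its classical form E‖Y − x‖ − ½·E‖Y − Y′‖ with Y, Y′ independent model samples, might look like the following. The function name and interface are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def energy_loss(x, y, y_prime):
    """Monte Carlo estimate of the energy score.

    x        : (batch, dim) samples from the data distribution
    y, y_prime : (batch, dim) two independent batches of model samples

    Returns E||y - x|| - 0.5 * E||y - y'||, which (in expectation) is
    minimized when the model distribution matches the data distribution.
    No Jacobian determinant is needed -- only samples.
    """
    term_data = np.linalg.norm(y - x, axis=-1).mean()
    term_model = np.linalg.norm(y - y_prime, axis=-1).mean()
    return term_data - 0.5 * term_model
```

Because the loss depends only on samples from the model, it places no invertibility or tractable-determinant constraints on the sampler's architecture, which is what allows the semi-autoregressive designs described in the paper.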

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-si23a,
  title     = {Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows},
  author    = {Si, Phillip and Chen, Zeyi and Sahoo, Subham Sekhar and Schiff, Yair and Kuleshov, Volodymyr},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {31732--31753},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/si23a/si23a.pdf},
  url       = {https://proceedings.mlr.press/v202/si23a.html},
  abstract  = {Training normalizing flow generative models can be challenging due to the need to calculate computationally expensive determinants of Jacobians. This paper studies the likelihood-free training of flows and proposes the energy objective, an alternative sample-based loss based on proper scoring rules. The energy objective is determinant-free and supports flexible model architectures that are not easily compatible with maximum likelihood training, including semi-autoregressive energy flows, a novel model family that interpolates between fully autoregressive and non-autoregressive models. Energy flows feature competitive sample quality, posterior inference, and generation speed relative to likelihood-based flows; this performance is decorrelated from the quality of log-likelihood estimates, which are generally very poor. Our findings question the use of maximum likelihood as an objective or a metric, and contribute to a scientific study of its role in generative modeling. Code is available at https://github.com/ps789/SAEF.}
}
Endnote
%0 Conference Paper
%T Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows
%A Phillip Si
%A Zeyi Chen
%A Subham Sekhar Sahoo
%A Yair Schiff
%A Volodymyr Kuleshov
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-si23a
%I PMLR
%P 31732--31753
%U https://proceedings.mlr.press/v202/si23a.html
%V 202
%X Training normalizing flow generative models can be challenging due to the need to calculate computationally expensive determinants of Jacobians. This paper studies the likelihood-free training of flows and proposes the energy objective, an alternative sample-based loss based on proper scoring rules. The energy objective is determinant-free and supports flexible model architectures that are not easily compatible with maximum likelihood training, including semi-autoregressive energy flows, a novel model family that interpolates between fully autoregressive and non-autoregressive models. Energy flows feature competitive sample quality, posterior inference, and generation speed relative to likelihood-based flows; this performance is decorrelated from the quality of log-likelihood estimates, which are generally very poor. Our findings question the use of maximum likelihood as an objective or a metric, and contribute to a scientific study of its role in generative modeling. Code is available at https://github.com/ps789/SAEF.
APA
Si, P., Chen, Z., Sahoo, S.S., Schiff, Y. &amp; Kuleshov, V. (2023). Semi-Autoregressive Energy Flows: Exploring Likelihood-Free Training of Normalizing Flows. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:31732-31753. Available from https://proceedings.mlr.press/v202/si23a.html.