Training Normalizing Flows from Dependent Data

Matthias Kirchler, Christoph Lippert, Marius Kloft
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:17105-17121, 2023.

Abstract

Normalizing flows are powerful non-parametric statistical models that function as a hybrid between density estimators and generative models. Current learning algorithms for normalizing flows assume that data points are sampled independently, an assumption that is frequently violated in practice, which may lead to erroneous density estimation and data generation. We propose a likelihood objective of normalizing flows incorporating dependencies between the data points, for which we derive a flexible and efficient learning algorithm suitable for different dependency structures. We show that respecting dependencies between observations can improve empirical results on both synthetic and real-world data, and leads to higher statistical power in a downstream application to genome-wide association studies.
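As a rough illustration of the idea summarized above (not the authors' actual implementation), the sketch below shows one plausible way to replace the usual i.i.d. standard-normal base density of a flow's likelihood with a joint Gaussian across samples, whose covariance matrix Sigma encodes the dependencies (for example, a kinship matrix in the GWAS setting). The toy AffineFlow class, the dependent_log_likelihood function, and the example covariance are all hypothetical names introduced here for exposition.

import math
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Toy elementwise affine flow: z = (x - b) * exp(-log_s)."""
    def __init__(self, dim):
        super().__init__()
        self.b = nn.Parameter(torch.zeros(dim))
        self.log_s = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        z = (x - self.b) * torch.exp(-self.log_s)
        # log |det dz/dx| is identical for every sample in this toy flow
        log_det = (-self.log_s.sum()).expand(x.shape[0])
        return z, log_det

def dependent_log_likelihood(z, log_det, sigma):
    """Log-likelihood where each latent dimension z[:, j] is jointly N(0, sigma)
    over the n samples, instead of n independent standard normals.

    z:       (n, d) latent codes
    log_det: (n,)   per-sample log |det Jacobian| of the flow
    sigma:   (n, n) known covariance between samples (illustrative assumption)
    """
    n, d = z.shape
    chol = torch.linalg.cholesky(sigma)
    # u = L^{-1} z, so (u ** 2).sum(0)[j] equals z[:, j]^T sigma^{-1} z[:, j]
    u = torch.linalg.solve_triangular(chol, z, upper=False)
    log_det_sigma = 2.0 * torch.log(torch.diagonal(chol)).sum()
    base = -0.5 * ((u ** 2).sum() + d * log_det_sigma + n * d * math.log(2 * math.pi))
    return base + log_det.sum()

# Usage sketch: maximize the dependency-aware likelihood.
n, d = 64, 2
x = torch.randn(n, d)
sigma = 0.9 * torch.eye(n) + 0.1 * torch.ones(n, n)  # made-up sample covariance
flow = AffineFlow(d)
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)
for _ in range(200):
    z, log_det = flow(x)
    loss = -dependent_log_likelihood(z, log_det, sigma)
    opt.zero_grad(); loss.backward(); opt.step()

Setting sigma to the identity recovers the standard i.i.d. objective, which is one way to sanity-check such a sketch against an ordinary flow implementation.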

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-kirchler23a,
  title     = {Training Normalizing Flows from Dependent Data},
  author    = {Kirchler, Matthias and Lippert, Christoph and Kloft, Marius},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {17105--17121},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/kirchler23a/kirchler23a.pdf},
  url       = {https://proceedings.mlr.press/v202/kirchler23a.html},
  abstract  = {Normalizing flows are powerful non-parametric statistical models that function as a hybrid between density estimators and generative models. Current learning algorithms for normalizing flows assume that data points are sampled independently, an assumption that is frequently violated in practice, which may lead to erroneous density estimation and data generation. We propose a likelihood objective of normalizing flows incorporating dependencies between the data points, for which we derive a flexible and efficient learning algorithm suitable for different dependency structures. We show that respecting dependencies between observations can improve empirical results on both synthetic and real-world data, and leads to higher statistical power in a downstream application to genome-wide association studies.}
}
Endnote
%0 Conference Paper
%T Training Normalizing Flows from Dependent Data
%A Matthias Kirchler
%A Christoph Lippert
%A Marius Kloft
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-kirchler23a
%I PMLR
%P 17105--17121
%U https://proceedings.mlr.press/v202/kirchler23a.html
%V 202
%X Normalizing flows are powerful non-parametric statistical models that function as a hybrid between density estimators and generative models. Current learning algorithms for normalizing flows assume that data points are sampled independently, an assumption that is frequently violated in practice, which may lead to erroneous density estimation and data generation. We propose a likelihood objective of normalizing flows incorporating dependencies between the data points, for which we derive a flexible and efficient learning algorithm suitable for different dependency structures. We show that respecting dependencies between observations can improve empirical results on both synthetic and real-world data, and leads to higher statistical power in a downstream application to genome-wide association studies.
APA
Kirchler, M., Lippert, C. & Kloft, M. (2023). Training Normalizing Flows from Dependent Data. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:17105-17121. Available from https://proceedings.mlr.press/v202/kirchler23a.html.