On the expressivity of bi-Lipschitz normalizing flows

Alexandre Verine, Benjamin Negrevergne, Yann Chevaleyre, Fabrice Rossi
Proceedings of The 14th Asian Conference on Machine Learning, PMLR 189:1054-1069, 2023.

Abstract

An invertible function is bi-Lipschitz if both the function and its inverse have bounded Lipschitz constants. Most state-of-the-art Normalizing Flows are bi-Lipschitz by design or by training to limit numerical errors (among other things). In this paper, we discuss the expressivity of bi-Lipschitz Normalizing Flows and identify several target distributions that are difficult to approximate using such models. Then, we characterize the expressivity of bi-Lipschitz Normalizing Flows by giving several lower bounds on the Total Variation distance between these particularly unfavorable distributions and their best possible approximation. Finally, we show how to use the bounds to adjust the training parameters, and discuss potential remedies.
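For readers unfamiliar with the term, the bi-Lipschitz condition described in the abstract can be stated formally as follows (the notation $L_f$, $L_{f^{-1}}$ is ours, not necessarily the paper's):

```latex
% An invertible f is bi-Lipschitz if there exist constants
% L_f and L_{f^{-1}} bounding f and its inverse, i.e.
\[
  \frac{1}{L_{f^{-1}}}\,\|x - x'\| \;\le\; \|f(x) - f(x')\| \;\le\; L_f\,\|x - x'\|
  \qquad \text{for all } x, x'.
\]
```

The lower bound is equivalent to requiring that $f^{-1}$ be Lipschitz with constant $L_{f^{-1}}$; it is this two-sided control that limits how sharply a flow can concentrate or separate probability mass.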

Cite this Paper


BibTeX
@InProceedings{pmlr-v189-verine23a,
  title     = {On the expressivity of bi-Lipschitz normalizing flows},
  author    = {Verine, Alexandre and Negrevergne, Benjamin and Chevaleyre, Yann and Rossi, Fabrice},
  booktitle = {Proceedings of The 14th Asian Conference on Machine Learning},
  pages     = {1054--1069},
  year      = {2023},
  editor    = {Khan, Emtiyaz and Gonen, Mehmet},
  volume    = {189},
  series    = {Proceedings of Machine Learning Research},
  month     = {12--14 Dec},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v189/verine23a/verine23a.pdf},
  url       = {https://proceedings.mlr.press/v189/verine23a.html},
  abstract  = {An invertible function is bi-Lipschitz if both the function and its inverse have bounded Lipschitz constants. Most state-of-the-art Normalizing Flows are bi-Lipschitz by design or by training to limit numerical errors (among other things). In this paper, we discuss the expressivity of bi-Lipschitz Normalizing Flows and identify several target distributions that are difficult to approximate using such models. Then, we characterize the expressivity of bi-Lipschitz Normalizing Flows by giving several lower bounds on the Total Variation distance between these particularly unfavorable distributions and their best possible approximation. Finally, we show how to use the bounds to adjust the training parameters, and discuss potential remedies.}
}
Endnote
%0 Conference Paper
%T On the expressivity of bi-Lipschitz normalizing flows
%A Alexandre Verine
%A Benjamin Negrevergne
%A Yann Chevaleyre
%A Fabrice Rossi
%B Proceedings of The 14th Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Emtiyaz Khan
%E Mehmet Gonen
%F pmlr-v189-verine23a
%I PMLR
%P 1054--1069
%U https://proceedings.mlr.press/v189/verine23a.html
%V 189
%X An invertible function is bi-Lipschitz if both the function and its inverse have bounded Lipschitz constants. Most state-of-the-art Normalizing Flows are bi-Lipschitz by design or by training to limit numerical errors (among other things). In this paper, we discuss the expressivity of bi-Lipschitz Normalizing Flows and identify several target distributions that are difficult to approximate using such models. Then, we characterize the expressivity of bi-Lipschitz Normalizing Flows by giving several lower bounds on the Total Variation distance between these particularly unfavorable distributions and their best possible approximation. Finally, we show how to use the bounds to adjust the training parameters, and discuss potential remedies.
APA
Verine, A., Negrevergne, B., Chevaleyre, Y. & Rossi, F. (2023). On the expressivity of bi-Lipschitz normalizing flows. Proceedings of The 14th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 189:1054-1069. Available from https://proceedings.mlr.press/v189/verine23a.html.