Resampling Base Distributions of Normalizing Flows

Vincent Stimper, Bernhard Schölkopf, José Miguel Hernández-Lobato
Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, PMLR 151:4915-4936, 2022.

Abstract

Normalizing flows are a popular class of models for approximating probability distributions. However, their invertible nature limits their ability to model target distributions whose support has a complex topological structure, such as Boltzmann distributions. Several procedures have been proposed to solve this problem, but many of them sacrifice invertibility and, thereby, the tractability of the log-likelihood as well as other desirable properties. To address these limitations, we introduce a base distribution for normalizing flows based on learned rejection sampling, allowing the resulting normalizing flow to model complicated distributions without giving up bijectivity. Furthermore, we develop suitable learning algorithms based on both maximum likelihood estimation and Kullback-Leibler divergence minimization, and apply them to various example problems: approximating 2D densities, density estimation on tabular data, image generation, and modeling Boltzmann distributions. In these experiments, our method is competitive with or outperforms the baselines.
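The base distribution at the heart of the method is a simple proposal (e.g. a Gaussian) reshaped by a learned acceptance function a(z) in (0, 1). Below is a minimal PyTorch sketch in the spirit of the learned accept/reject sampling the paper builds on (Bauer & Mnih, 2019): up to T proposals are drawn, each kept with probability a(z), and the T-th is kept unconditionally, which yields the closed-form density used in log_prob. The class name, acceptance network, and single-batch Monte Carlo estimate of Z = E_pi[a(z)] are illustrative assumptions, not the authors' implementation.

import math

import torch
import torch.nn as nn


class ResampledGaussian(nn.Module):
    """Standard-normal proposal pi(z) reshaped by a learned acceptance
    function a(z), giving the truncated resampled density
        q(z) = (1 - (1 - Z)^(T-1)) / Z * a(z) pi(z) + (1 - Z)^(T-1) pi(z),
    where Z = E_pi[a(z)] and T caps the number of rejection-sampling trials."""

    def __init__(self, dim, T=100, hidden=128):
        super().__init__()
        self.dim, self.T = dim, T
        # Acceptance function a(z); the architecture is an illustrative choice.
        self.a = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def log_prob(self, z, n_mc=1024):
        # Differentiable single-batch Monte Carlo estimate of Z = E_pi[a(z)].
        z_mc = torch.randn(n_mc, self.dim, device=z.device)
        Z = self.a(z_mc).mean()
        alpha = (1.0 - Z) ** (self.T - 1)  # prob. the first T-1 draws all get rejected
        log_pi = -0.5 * z.pow(2).sum(-1) - 0.5 * self.dim * math.log(2 * math.pi)
        acc = self.a(z).squeeze(-1)
        return log_pi + torch.log((1.0 - alpha) / Z * acc + alpha)

    @torch.no_grad()
    def sample(self, n):
        samples = []
        for _ in range(n):
            for t in range(self.T):
                z = torch.randn(self.dim)
                # Keep the proposal with probability a(z); the T-th draw is
                # kept unconditionally, matching the (1 - Z)^(T-1) pi(z) term.
                if t == self.T - 1 or torch.rand(()).item() < self.a(z).item():
                    break
            samples.append(z)
        return torch.stack(samples)

Because q(z) has a tractable log-density, such a base can sit under any bijective flow and enter the change-of-variables formula exactly as a Gaussian would, which is how the construction avoids giving up bijectivity; maximizing the log-likelihood or minimizing the KL divergence then trains the flow and the acceptance network jointly.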

Cite this Paper


BibTeX
@InProceedings{pmlr-v151-stimper22a,
  title     = {Resampling Base Distributions of Normalizing Flows},
  author    = {Stimper, Vincent and Sch\"olkopf, Bernhard and Hern\'andez-Lobato, Jos\'e Miguel},
  booktitle = {Proceedings of The 25th International Conference on Artificial Intelligence and Statistics},
  pages     = {4915--4936},
  year      = {2022},
  editor    = {Camps-Valls, Gustau and Ruiz, Francisco J. R. and Valera, Isabel},
  volume    = {151},
  series    = {Proceedings of Machine Learning Research},
  month     = {28--30 Mar},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v151/stimper22a/stimper22a.pdf},
  url       = {https://proceedings.mlr.press/v151/stimper22a.html}
}
Endnote
%0 Conference Paper
%T Resampling Base Distributions of Normalizing Flows
%A Vincent Stimper
%A Bernhard Schölkopf
%A José Miguel Hernández-Lobato
%B Proceedings of The 25th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2022
%E Gustau Camps-Valls
%E Francisco J. R. Ruiz
%E Isabel Valera
%F pmlr-v151-stimper22a
%I PMLR
%P 4915--4936
%U https://proceedings.mlr.press/v151/stimper22a.html
%V 151
APA
Stimper, V., Schölkopf, B. & Hernández-Lobato, J. M. (2022). Resampling Base Distributions of Normalizing Flows. Proceedings of The 25th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 151:4915-4936. Available from https://proceedings.mlr.press/v151/stimper22a.html.