Conditionally Strongly Log-Concave Generative Models

Florentin Guth, Etienne Lempereur, Joan Bruna, Stéphane Mallat
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:12224-12251, 2023.

Abstract

There is a growing gap between the impressive results of deep image generative models and classical algorithms that offer theoretical guarantees. The former suffer from mode collapse or memorization issues, limiting their application to scientific data. The latter require restrictive assumptions such as log-concavity to escape the curse of dimensionality. We partially bridge this gap by introducing conditionally strongly log-concave (CSLC) models, which factorize the data distribution into a product of conditional probability distributions that are strongly log-concave. This factorization is obtained with orthogonal projectors adapted to the data distribution. It leads to efficient parameter estimation and sampling algorithms, with theoretical guarantees, although the data distribution is not globally log-concave. We show that several challenging multiscale processes are conditionally log-concave using wavelet packet orthogonal projectors. Numerical results are shown for physical fields such as the $\varphi^4$ model and weak lensing convergence maps with higher resolution than in previous works.
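The core idea — factorize the distribution so that each conditional is strongly log-concave, then sample each factor with guarantees — can be sketched with a toy coarse-to-fine example. This is not the authors' code: the quadratic conditional potential, the coupling constants, and the Gaussian stand-in for the coarse field are all illustrative assumptions; the paper uses wavelet packet projectors and learned potentials. The sketch only shows why strong log-concavity helps: unadjusted Langevin dynamics on a strongly convex potential mixes fast.

```python
# Hypothetical sketch (not the paper's implementation): sample a "fine" field
# conditionally on a "coarse" one, where the conditional distribution is
# strongly log-concave, using unadjusted Langevin dynamics (ULA).
import numpy as np

rng = np.random.default_rng(0)

def grad_U(x_fine, x_coarse, kappa=1.0, coupling=0.5):
    # Illustrative strongly convex conditional potential:
    #   U(x_f | x_c) = kappa/2 * ||x_f||^2 + coupling/2 * ||x_f - x_c||^2
    # Its Hessian is (kappa + coupling) * I > 0, so the conditional is
    # strongly log-concave and ULA converges geometrically.
    return kappa * x_fine + coupling * (x_fine - x_coarse)

def langevin_conditional(x_coarse, n_steps=500, step=0.05):
    # ULA update: x <- x - step * grad U(x) + sqrt(2 * step) * noise
    x = np.zeros_like(x_coarse)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step * grad_U(x, x_coarse) + np.sqrt(2 * step) * noise
    return x

# Sample the coarse field first (here: a Gaussian stand-in), then the
# finer scale conditionally on it — the CSLC factorization in miniature.
x_coarse = rng.standard_normal(64)
x_fine = langevin_conditional(x_coarse)
```

Even though the joint distribution built this way need not be globally log-concave, each conditional sampling step enjoys the fast-mixing guarantees of the strongly log-concave setting, which is the point the abstract makes.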

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-guth23a,
  title     = {Conditionally Strongly Log-Concave Generative Models},
  author    = {Guth, Florentin and Lempereur, Etienne and Bruna, Joan and Mallat, St\'{e}phane},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {12224--12251},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/guth23a/guth23a.pdf},
  url       = {https://proceedings.mlr.press/v202/guth23a.html},
  abstract  = {There is a growing gap between the impressive results of deep image generative models and classical algorithms that offer theoretical guarantees. The former suffer from mode collapse or memorization issues, limiting their application to scientific data. The latter require restrictive assumptions such as log-concavity to escape the curse of dimensionality. We partially bridge this gap by introducing conditionally strongly log-concave (CSLC) models, which factorize the data distribution into a product of conditional probability distributions that are strongly log-concave. This factorization is obtained with orthogonal projectors adapted to the data distribution. It leads to efficient parameter estimation and sampling algorithms, with theoretical guarantees, although the data distribution is not globally log-concave. We show that several challenging multiscale processes are conditionally log-concave using wavelet packet orthogonal projectors. Numerical results are shown for physical fields such as the $\varphi^4$ model and weak lensing convergence maps with higher resolution than in previous works.}
}
Endnote
%0 Conference Paper
%T Conditionally Strongly Log-Concave Generative Models
%A Florentin Guth
%A Etienne Lempereur
%A Joan Bruna
%A Stéphane Mallat
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-guth23a
%I PMLR
%P 12224--12251
%U https://proceedings.mlr.press/v202/guth23a.html
%V 202
%X There is a growing gap between the impressive results of deep image generative models and classical algorithms that offer theoretical guarantees. The former suffer from mode collapse or memorization issues, limiting their application to scientific data. The latter require restrictive assumptions such as log-concavity to escape the curse of dimensionality. We partially bridge this gap by introducing conditionally strongly log-concave (CSLC) models, which factorize the data distribution into a product of conditional probability distributions that are strongly log-concave. This factorization is obtained with orthogonal projectors adapted to the data distribution. It leads to efficient parameter estimation and sampling algorithms, with theoretical guarantees, although the data distribution is not globally log-concave. We show that several challenging multiscale processes are conditionally log-concave using wavelet packet orthogonal projectors. Numerical results are shown for physical fields such as the $\varphi^4$ model and weak lensing convergence maps with higher resolution than in previous works.
APA
Guth, F., Lempereur, E., Bruna, J., & Mallat, S. (2023). Conditionally Strongly Log-Concave Generative Models. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:12224-12251. Available from https://proceedings.mlr.press/v202/guth23a.html.