Score-Guided Intermediate Level Optimization: Fast Langevin Mixing for Inverse Problems

Giannis Daras, Yuval Dagan, Alex Dimakis, Constantinos Daskalakis
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:4722-4753, 2022.

Abstract

We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting randomly weighted DNN generators. This result extends the work of Hand and Voroninski from efficient inversion to efficient posterior sampling. In practice, to allow for increased expressivity, we propose to perform posterior sampling in the latent space of a pre-trained generative model. To achieve this, we train a score-based model in the latent space of a StyleGAN-2 and use it to solve inverse problems. Our framework, Score-Guided Intermediate Layer Optimization (SGILO), extends prior work by replacing the sparsity regularization with a generative prior in the intermediate layer. Experimentally, we obtain significant improvements over the previous state of the art, especially in the low-measurement regime.
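
The method admits a compact algorithmic summary: run Langevin dynamics over an intermediate layer of the generator, combining the gradient of a measurement-consistency term with the learned score of the intermediate-layer prior. The following PyTorch-style sketch only illustrates that idea; it is not the authors' released implementation, and all names (sgilo_langevin, generator_tail, score_model), the Gaussian noise model, and the fixed step size are illustrative assumptions.

import torch

def sgilo_langevin(y, A, generator_tail, score_model, z_init,
                   n_steps=1000, step_size=1e-4, noise_scale=1.0):
    """Sketch of posterior sampling z ~ p(z | y) for measurements y ≈ A(generator_tail(z)).

    generator_tail -- maps the intermediate latent z to an image (e.g. the later
                      layers of a pretrained StyleGAN-2 generator)  [assumed interface]
    score_model    -- approximates the score of the intermediate-layer prior,
                      i.e. grad_z log p(z)                          [assumed interface]
    A              -- known forward operator of the inverse problem
    """
    z = z_init.clone().requires_grad_(True)
    for _ in range(n_steps):
        # Likelihood term: gradient of -log p(y | z) under a Gaussian noise model.
        residual = A(generator_tail(z)) - y
        data_loss = 0.5 * (residual ** 2).sum()
        grad_data = torch.autograd.grad(data_loss, z)[0]

        with torch.no_grad():
            # Prior term: learned score grad_z log p(z), playing the role that
            # sparsity regularization plays in intermediate layer optimization.
            grad_prior = score_model(z)

            # Langevin update: ascend the log-posterior and inject Gaussian noise.
            z += step_size * (grad_prior - grad_data)
            z += noise_scale * (2.0 * step_size) ** 0.5 * torch.randn_like(z)
    return z.detach()

In the paper's setting the score model is trained on intermediate StyleGAN-2 activations; the sketch above only shows how such a score would enter the Langevin update for a generic forward operator A.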

Cite this Paper


BibTeX
@InProceedings{pmlr-v162-daras22a,
  title     = {Score-Guided Intermediate Level Optimization: Fast {L}angevin Mixing for Inverse Problems},
  author    = {Daras, Giannis and Dagan, Yuval and Dimakis, Alex and Daskalakis, Constantinos},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {4722--4753},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/daras22a/daras22a.pdf},
  url       = {https://proceedings.mlr.press/v162/daras22a.html},
  abstract  = {We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting random weighted DNN generators. This result extends the work of Hand and Voroninski from efficient inversion to efficient posterior sampling. In practice, to allow for increased expressivity, we propose to do posterior sampling in the latent space of a pre-trained generative model. To achieve that, we train a score-based model in the latent space of a StyleGAN-2 and we use it to solve inverse problems. Our framework, Score-Guided Intermediate Layer Optimization (SGILO), extends prior work by replacing the sparsity regularization with a generative prior in the intermediate layer. Experimentally, we obtain significant improvements over the previous state-of-the-art, especially in the low measurement regime.}
}
Endnote
%0 Conference Paper
%T Score-Guided Intermediate Level Optimization: Fast Langevin Mixing for Inverse Problems
%A Giannis Daras
%A Yuval Dagan
%A Alex Dimakis
%A Constantinos Daskalakis
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-daras22a
%I PMLR
%P 4722--4753
%U https://proceedings.mlr.press/v162/daras22a.html
%V 162
%X We prove fast mixing and characterize the stationary distribution of the Langevin Algorithm for inverting random weighted DNN generators. This result extends the work of Hand and Voroninski from efficient inversion to efficient posterior sampling. In practice, to allow for increased expressivity, we propose to do posterior sampling in the latent space of a pre-trained generative model. To achieve that, we train a score-based model in the latent space of a StyleGAN-2 and we use it to solve inverse problems. Our framework, Score-Guided Intermediate Layer Optimization (SGILO), extends prior work by replacing the sparsity regularization with a generative prior in the intermediate layer. Experimentally, we obtain significant improvements over the previous state-of-the-art, especially in the low measurement regime.
APA
Daras, G., Dagan, Y., Dimakis, A. & Daskalakis, C. (2022). Score-Guided Intermediate Level Optimization: Fast Langevin Mixing for Inverse Problems. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:4722-4753. Available from https://proceedings.mlr.press/v162/daras22a.html.