Parallel and Flexible Sampling from Autoregressive Models via Langevin Dynamics

Vivek Jayaram, John Thickstun
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:4807-4818, 2021.

Abstract

This paper introduces an alternative approach to sampling from autoregressive models. Autoregressive models are typically sampled sequentially, according to the transition dynamics defined by the model. Instead, we propose a sampling procedure that initializes a sequence with white noise and follows a Markov chain defined by Langevin dynamics on the global log-likelihood of the sequence. This approach parallelizes the sampling process and generalizes to conditional sampling. Using an autoregressive model as a Bayesian prior, we can steer the output of a generative model using a conditional likelihood or constraints. We apply these techniques to autoregressive models in the visual and audio domains, with competitive results for audio source separation, super-resolution, and inpainting.
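The sampling procedure the abstract describes can be sketched in a few lines. This is an illustrative toy, not the paper's code: it initializes a whole sequence with white noise and updates every position in parallel with a Langevin step on a global log-likelihood. The "model" here is a factorized standard Gaussian (so the score is simply `-x`); in the paper, the score would come from an autoregressive model.

```python
# Minimal Langevin-dynamics sampling sketch (toy example, assumes a
# Gaussian target; a real application would plug in grad log p(x) from
# a trained autoregressive model over the full sequence).
import numpy as np

def grad_log_p(x):
    # Score of a standard Gaussian: grad log N(x; 0, I) = -x.
    return -x

def langevin_sample(seq_len, n_steps=2000, step=0.01, seed=0):
    rng = np.random.default_rng(seed)
    x = 5.0 * rng.standard_normal(seq_len)  # white-noise init, far from target
    for _ in range(n_steps):
        noise = rng.standard_normal(seq_len)
        # Langevin update: x <- x + (step/2) * grad log p(x) + sqrt(step) * eps.
        # Every sequence position is updated in parallel.
        x = x + 0.5 * step * grad_log_p(x) + np.sqrt(step) * noise
    return x

samples = langevin_sample(seq_len=10_000)
print(samples.mean(), samples.std())  # both approach the target's 0 and 1
```

With a small step size the chain's stationary distribution closely approximates the target, so the empirical mean and standard deviation end up near 0 and 1. Conditional sampling, as in the paper, would add the gradient of a conditional likelihood or constraint term to `grad_log_p`.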

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-jayaram21b,
  title     = {Parallel and Flexible Sampling from Autoregressive Models via Langevin Dynamics},
  author    = {Jayaram, Vivek and Thickstun, John},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {4807--4818},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/jayaram21b/jayaram21b.pdf},
  url       = {https://proceedings.mlr.press/v139/jayaram21b.html},
  abstract  = {This paper introduces an alternative approach to sampling from autoregressive models. Autoregressive models are typically sampled sequentially, according to the transition dynamics defined by the model. Instead, we propose a sampling procedure that initializes a sequence with white noise and follows a Markov chain defined by Langevin dynamics on the global log-likelihood of the sequence. This approach parallelizes the sampling process and generalizes to conditional sampling. Using an autoregressive model as a Bayesian prior, we can steer the output of a generative model using a conditional likelihood or constraints. We apply these techniques to autoregressive models in the visual and audio domains, with competitive results for audio source separation, super-resolution, and inpainting.}
}
Endnote
%0 Conference Paper
%T Parallel and Flexible Sampling from Autoregressive Models via Langevin Dynamics
%A Vivek Jayaram
%A John Thickstun
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-jayaram21b
%I PMLR
%P 4807--4818
%U https://proceedings.mlr.press/v139/jayaram21b.html
%V 139
%X This paper introduces an alternative approach to sampling from autoregressive models. Autoregressive models are typically sampled sequentially, according to the transition dynamics defined by the model. Instead, we propose a sampling procedure that initializes a sequence with white noise and follows a Markov chain defined by Langevin dynamics on the global log-likelihood of the sequence. This approach parallelizes the sampling process and generalizes to conditional sampling. Using an autoregressive model as a Bayesian prior, we can steer the output of a generative model using a conditional likelihood or constraints. We apply these techniques to autoregressive models in the visual and audio domains, with competitive results for audio source separation, super-resolution, and inpainting.
APA
Jayaram, V. & Thickstun, J. (2021). Parallel and Flexible Sampling from Autoregressive Models via Langevin Dynamics. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:4807-4818. Available from https://proceedings.mlr.press/v139/jayaram21b.html.