Improving Sequential Latent Variable Models with Autoregressive Flows

Joseph Marino, Lei Chen, Jiawei He, Stephan Mandt
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-16, 2020.

Abstract

We propose an approach for sequence modeling based on autoregressive normalizing flows. Each autoregressive transform, acting across time, serves as a moving reference frame for modeling higher-level dynamics. This technique provides a simple, general-purpose method for improving sequence modeling, with connections to existing and classical techniques. We demonstrate the proposed approach both with standalone models and as part of larger sequential latent variable models. Results are presented on three benchmark video datasets, where flow-based dynamics improve log-likelihood performance over baseline models.
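To make the idea concrete, the following is a minimal, hypothetical sketch (not the authors' code) of an affine autoregressive transform applied across time: each observation is shifted and scaled using predictions computed from past observations, so higher-level dynamics can be modeled in the resulting "moving reference frame." The conditioner functions predict_shift and predict_scale, and the constant-velocity toy conditioner in the usage example, are illustrative assumptions only.

```python
# Minimal sketch (assumed, not the authors' implementation) of an affine
# autoregressive flow acting across time steps.
import numpy as np

def forward_transform(x, predict_shift, predict_scale):
    """Map an observed sequence x[0..T-1] to a 'noise' sequence y, where
    y_t = (x_t - mu(x_{<t})) / sigma(x_{<t}). Higher-level dynamics are then
    modeled on y, i.e. in a moving reference frame given by the conditioner."""
    y = np.empty_like(x)
    log_det = 0.0
    for t in range(len(x)):
        mu = predict_shift(x[:t])
        sigma = predict_scale(x[:t])
        y[t] = (x[t] - mu) / sigma
        # Change-of-variables term that enters the log-likelihood.
        log_det -= np.sum(np.log(sigma))
    return y, log_det

def inverse_transform(y, predict_shift, predict_scale):
    """Invert the flow: reconstruct x from y, one time step at a time."""
    x = np.empty_like(y)
    for t in range(len(y)):
        x[t] = predict_shift(x[:t]) + predict_scale(x[:t]) * y[t]
    return x

# Toy usage with a simple "copy the previous frame" conditioner (an assumption
# for illustration): the flow then models frame-to-frame differences.
predict_shift = lambda past: past[-1] if len(past) else np.zeros(3)
predict_scale = lambda past: np.ones(3)

x = np.cumsum(np.random.randn(10, 3), axis=0)   # random-walk sequence of 3-dim frames
y, log_det = forward_transform(x, predict_shift, predict_scale)
x_rec = inverse_transform(y, predict_shift, predict_scale)
assert np.allclose(x, x_rec)                     # the transform is invertible
```

In a learned model, the conditioners would be neural networks conditioned on past frames, and the log-determinant term above would be added to the log-likelihood of the higher-level model on y.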

Cite this Paper


BibTeX
@InProceedings{pmlr-v118-marino20a,
  title     = {Improving Sequential Latent Variable Models with Autoregressive Flows},
  author    = {Marino, Joseph and Chen, Lei and He, Jiawei and Mandt, Stephan},
  booktitle = {Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference},
  pages     = {1--16},
  year      = {2020},
  editor    = {Zhang, Cheng and Ruiz, Francisco and Bui, Thang and Dieng, Adji Bousso and Liang, Dawen},
  volume    = {118},
  series    = {Proceedings of Machine Learning Research},
  month     = {08 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v118/marino20a/marino20a.pdf},
  url       = {https://proceedings.mlr.press/v118/marino20a.html},
  abstract  = {We propose an approach for sequence modeling based on autoregressive normalizing flows. Each autoregressive transform, acting across time, serves as a moving reference frame for modeling higher-level dynamics. This technique provides a simple, general-purpose method for improving sequence modeling, with connections to existing and classical techniques. We demonstrate the proposed approach both with standalone models and as part of larger sequential latent variable models. Results are presented on three benchmark video datasets, where flow-based dynamics improve log-likelihood performance over baseline models.}
}
Endnote
%0 Conference Paper
%T Improving Sequential Latent Variable Models with Autoregressive Flows
%A Joseph Marino
%A Lei Chen
%A Jiawei He
%A Stephan Mandt
%B Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference
%C Proceedings of Machine Learning Research
%D 2020
%E Cheng Zhang
%E Francisco Ruiz
%E Thang Bui
%E Adji Bousso Dieng
%E Dawen Liang
%F pmlr-v118-marino20a
%I PMLR
%P 1--16
%U https://proceedings.mlr.press/v118/marino20a.html
%V 118
%X We propose an approach for sequence modeling based on autoregressive normalizing flows. Each autoregressive transform, acting across time, serves as a moving reference frame for modeling higher-level dynamics. This technique provides a simple, general-purpose method for improving sequence modeling, with connections to existing and classical techniques. We demonstrate the proposed approach both with standalone models and as part of larger sequential latent variable models. Results are presented on three benchmark video datasets, where flow-based dynamics improve log-likelihood performance over baseline models.
APA
Marino, J., Chen, L., He, J. & Mandt, S. (2020). Improving Sequential Latent Variable Models with Autoregressive Flows. Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, in Proceedings of Machine Learning Research 118:1-16. Available from https://proceedings.mlr.press/v118/marino20a.html.
