Improving Sequential Latent Variable Models with Autoregressive Flows
Proceedings of The 2nd Symposium on Advances in Approximate Bayesian Inference, PMLR 118:1-16, 2020.
Abstract
We propose an approach for sequence modeling based on autoregressive normalizing flows. Each autoregressive transform, acting across time, serves as a moving reference frame for modeling higher-level dynamics. This technique provides a simple, general-purpose method for improving sequence modeling, with connections to existing and classical techniques. We demonstrate the proposed approach both with standalone models and as part of larger sequential latent variable models. Results are presented on three benchmark video datasets, where flow-based dynamics improve log-likelihood performance over baseline models.
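As an illustrative sketch (the notation below, including the functions \(\boldsymbol{\mu}_\theta\) and \(\boldsymbol{\sigma}_\theta\), is our own rather than quoted from the paper), an affine autoregressive flow applied across time acts as a moving reference frame by normalizing each observation \(\mathbf{x}_t\) with statistics predicted from the preceding observations:
\[
\mathbf{y}_t = \frac{\mathbf{x}_t - \boldsymbol{\mu}_\theta(\mathbf{x}_{<t})}{\boldsymbol{\sigma}_\theta(\mathbf{x}_{<t})},
\qquad
\log p(\mathbf{x}_{1:T}) = \log p(\mathbf{y}_{1:T}) - \sum_{t=1}^{T} \sum_{i} \log \sigma_{\theta, i}(\mathbf{x}_{<t}),
\]
where the second equality follows from the change-of-variables formula for this element-wise transform, and the higher-level dynamics (e.g., a sequential latent variable model) are then placed on the transformed sequence \(\mathbf{y}_{1:T}\) rather than on \(\mathbf{x}_{1:T}\) directly.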