Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models


Christian Weilbach, Boyan Beronov, Frank Wood, William Harvey;
Proceedings of the Twenty-Third International Conference on Artificial Intelligence and Statistics, PMLR 108:4441-4451, 2020.

Abstract

We exploit minimally faithful inversion of graphical model structures to specify sparse continuous normalizing flows (CNFs) for amortized inference. We find that the sparsity of this factorization can be exploited to reduce the number of parameters in the neural network, the number of adaptive integration steps of the flow, and consequently the FLOPs at both training and inference time, without decreasing performance in comparison to unconstrained flows. By expressing the structure inversion as a compilation pass in a probabilistic programming language, we are able to apply it in a novel way to models as complex as convolutional neural networks. Furthermore, we extend the training objective for CNFs in the context of inference amortization to the symmetric Kullback-Leibler divergence, and demonstrate its theoretical and practical advantages.
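To make the core idea concrete, the following is a minimal sketch (not the paper's implementation) of how an inverted graphical-model structure can induce a sparsity mask on the weight matrix of a CNF's dynamics network. The graph, the `parents` dictionary, and the single-layer `dynamics` function are illustrative assumptions; the paper's actual architecture and inversion pass are more general.

```python
import numpy as np

# Hypothetical inverse-model structure: parents[i] lists the variables
# that latent i is allowed to depend on after (minimally faithful)
# structure inversion. This three-variable chain is purely illustrative.
parents = {0: [], 1: [0], 2: [0, 1]}
n = len(parents)

# Binary mask: entry (i, j) = 1 iff output dimension i may read
# input dimension j. Each variable may always read itself (diagonal).
mask = np.eye(n)
for i, ps in parents.items():
    for j in ps:
        mask[i, j] = 1.0

# A dense weight matrix for the flow's dynamics net, then sparsified so
# that only the dependencies permitted by the inverted graph survive.
rng = np.random.default_rng(0)
W = rng.normal(size=(n, n))
W_sparse = W * mask

def dynamics(z, t):
    # One masked linear layer with a nonlinearity, standing in for the
    # CNF's vector field dz/dt = f(z, t). Zeroed weights mean variable i
    # never receives gradient signal from non-parent variables.
    return np.tanh(W_sparse @ z)
```

Because the mask zeroes entire weight entries, the effective parameter count and per-step FLOPs of the vector field scale with the number of edges in the inverted graph rather than with the dense quadratic cost, which is the source of the savings the abstract describes.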
