Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models

Christian Weilbach, Boyan Beronov, Frank Wood, William Harvey
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:4441-4451, 2020.

Abstract

We exploit minimally faithful inversion of graphical model structures to specify sparse continuous normalizing flows (CNFs) for amortized inference. We find that the sparsity of this factorization can be exploited to reduce the number of parameters in the neural network, the number of adaptive integration steps of the flow, and consequently the FLOPs required at both training and inference time, without decreasing performance in comparison to unconstrained flows. By expressing the structure inversion as a compilation pass in a probabilistic programming language, we are able to apply it in a novel way to models as complex as convolutional neural networks. Furthermore, we extend the training objective for CNFs in the context of inference amortization to the symmetric Kullback-Leibler divergence, and demonstrate its theoretical and practical advantages.
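
As a rough illustration of the structured-sparsity idea in the abstract, the following minimal Python sketch masks the weight matrix of one layer of a CNF vector field with the adjacency of an (assumed, precomputed) inverted graphical model, so that only dependencies present in the inverse carry parameters and FLOPs. All names (inverse_adjacency_mask, masked_vector_field) and the simple linear-tanh dynamics are hypothetical and are not taken from the paper or its released code.

    # Hypothetical sketch, not the authors' implementation.
    import numpy as np

    def inverse_adjacency_mask(parents, n_latent, n_obs):
        """Build a 0/1 mask of shape (n_latent, n_latent + n_obs).

        parents[i] lists the indices (into the concatenated [z, x] vector)
        that latent i depends on in the inverted graph; every latent also
        depends on itself so the ODE dynamics remain well defined.
        """
        mask = np.zeros((n_latent, n_latent + n_obs))
        for i in range(n_latent):
            mask[i, parents[i]] = 1.0
            mask[i, i] = 1.0
        return mask

    def masked_vector_field(t, z, x, W, mask):
        """One linear layer of dz/dt = f(t, z, x), with the weight matrix
        pruned elementwise by the sparsity mask before it is applied."""
        inp = np.concatenate([z, x])
        return np.tanh((W * mask) @ inp)

    # Toy usage: two latents and one observation; z1 depends on {x},
    # z0 depends on {z1, x} in the (assumed) inverted graph.
    parents = {0: [1, 2], 1: [2]}
    mask = inverse_adjacency_mask(parents, n_latent=2, n_obs=1)
    W = np.random.randn(2, 3)
    dz_dt = masked_vector_field(0.0, np.zeros(2), np.ones(1), W, mask)

In the paper's setting, such a mask would be derived automatically from the minimally faithful inversion computed by the probabilistic-programming compilation pass, rather than specified by hand as in this toy example.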

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-weilbach20a,
  title     = {Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models},
  author    = {Weilbach, Christian and Beronov, Boyan and Wood, Frank and Harvey, William},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {4441--4451},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/weilbach20a/weilbach20a.pdf},
  url       = {https://proceedings.mlr.press/v108/weilbach20a.html}
}
Endnote
%0 Conference Paper
%T Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models
%A Christian Weilbach
%A Boyan Beronov
%A Frank Wood
%A William Harvey
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-weilbach20a
%I PMLR
%P 4441--4451
%U https://proceedings.mlr.press/v108/weilbach20a.html
%V 108
APA
Weilbach, C., Beronov, B., Wood, F. & Harvey, W. (2020). Structured Conditional Continuous Normalizing Flows for Efficient Amortized Inference in Graphical Models. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:4441-4451. Available from https://proceedings.mlr.press/v108/weilbach20a.html.