Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation

Samuel Wiqvist, Pierre-Alexandre Mattei, Umberto Picchini, Jes Frellsen
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6798-6807, 2019.

Abstract

We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so we can also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
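To make the invariance described above concrete, the sketch below implements the PEN functional form for a sequence x_1..x_n and Markov order d: an inner map is summed over all overlapping blocks of d+1 consecutive elements, and an outer map combines that pooled value with the first d elements. The function names `rho` and `phi` and their toy definitions are illustrative stand-ins for the paper's trained inner/outer neural networks, not the authors' implementation.

```python
# Minimal sketch of the PEN functional form, assuming:
#   f(x) = phi( x_1..x_d,  sum_{i=1}^{n-d} rho(x_i..x_{i+d}) )
# Such a function is invariant to block-switch transformations, i.e.
# rearrangements that preserve the first d elements and the multiset
# of overlapping (d+1)-blocks. Setting d = 0 recovers DeepSets.

def pen(x, d, rho, phi):
    """Evaluate the PEN form on sequence x with Markov order d."""
    pooled = sum(rho(tuple(x[i:i + d + 1])) for i in range(len(x) - d))
    return phi(tuple(x[:d]), pooled)

# Toy inner/outer maps standing in for small neural networks.
def rho(block):
    return sum((v + 1) ** 2 for v in block)

def phi(prefix, pooled):
    return pooled + 10 * sum(prefix)

# d = 0: the DeepSets case, invariant to any permutation of the input.
assert pen([3, 1, 2], 0, rho, phi) == pen([2, 3, 1], 0, rho, phi)

# d = 1: two sequences with the same first element and the same
# multiset of adjacent pairs receive identical outputs.
a = [1, 2, 1, 3, 1]   # pairs: (1,2), (2,1), (1,3), (3,1)
b = [1, 3, 1, 2, 1]   # same pairs, same first element
assert pen(a, 1, rho, phi) == pen(b, 1, rho, phi)
```

In the ABC setting, the output of such a network (trained to predict model parameters from simulated data) serves as the learned summary statistic.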

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-wiqvist19a,
  title     = {Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate {B}ayesian Computation},
  author    = {Wiqvist, Samuel and Mattei, Pierre-Alexandre and Picchini, Umberto and Frellsen, Jes},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6798--6807},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/wiqvist19a/wiqvist19a.pdf},
  url       = {https://proceedings.mlr.press/v97/wiqvist19a.html},
  abstract  = {We present a novel family of deep neural architectures, named partially exchangeable networks (PENs) that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN and we can therefore also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive, both considering time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.}
}
Endnote
%0 Conference Paper
%T Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation
%A Samuel Wiqvist
%A Pierre-Alexandre Mattei
%A Umberto Picchini
%A Jes Frellsen
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-wiqvist19a
%I PMLR
%P 6798--6807
%U https://proceedings.mlr.press/v97/wiqvist19a.html
%V 97
%X We present a novel family of deep neural architectures, named partially exchangeable networks (PENs) that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN and we can therefore also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive, both considering time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
APA
Wiqvist, S., Mattei, P., Picchini, U. & Frellsen, J. (2019). Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6798-6807. Available from https://proceedings.mlr.press/v97/wiqvist19a.html.

Related Material