Measure Transport with Kernel Stein Discrepancy

Matthew Fisher, Tui Nolan, Matthew Graham, Dennis Prangle, Chris Oates
Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, PMLR 130:1054-1062, 2021.

Abstract

Measure transport underpins several recent algorithms for posterior approximation in the Bayesian context, wherein a transport map is sought to minimise the Kullback–Leibler divergence (KLD) from the posterior to the approximation. The KLD is a strong mode of convergence, requiring absolute continuity of measures and placing restrictions on which transport maps can be permitted. Here we propose to minimise a kernel Stein discrepancy (KSD) instead, which requires only that the set of transport maps is dense in an $L^2$ sense; we demonstrate how this condition can be validated. The consistency of the associated posterior approximation is established, and empirical results suggest that KSD is a competitive and more flexible alternative to KLD for measure transport.
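To make the objective concrete, here is a rough sketch in generic notation; the symbols $\rho$, $\mathcal{T}$, $k$ and $k_\pi$ below follow the standard KSD literature and are not taken from the paper's own text. A transport map $T \in \mathcal{T}$ pushes a tractable reference measure $\rho$ (e.g. a standard Gaussian) forward to an approximation $T_\#\rho$ of the posterior $\pi$, and the map is selected by minimising the KSD:

$$ T^\star \in \operatorname{arg\,min}_{T \in \mathcal{T}} \; \mathrm{KSD}(T_\#\rho \,\|\, \pi), \qquad \mathrm{KSD}(\mu \,\|\, \pi)^2 = \mathbb{E}_{x, x' \sim \mu}\big[ k_\pi(x, x') \big], $$

where $k_\pi$ is the usual Stein kernel built from a base kernel $k$ and the score $\nabla \log \pi$,

$$ k_\pi(x, x') = \nabla_x \cdot \nabla_{x'} k(x, x') + \nabla_x k(x, x') \cdot \nabla_{x'} \log \pi(x') + \nabla_{x'} k(x, x') \cdot \nabla_x \log \pi(x) + k(x, x') \, \nabla_x \log \pi(x) \cdot \nabla_{x'} \log \pi(x'). $$

Because only $\nabla \log \pi$ appears, the objective can be evaluated with an unnormalised posterior density, and in practice the expectation is replaced by a U- or V-statistic over samples from $\rho$ pushed through $T$, giving a tractable objective in the parameters of $T$.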

Cite this Paper


BibTeX
@InProceedings{pmlr-v130-fisher21a,
  title     = {Measure Transport with Kernel Stein Discrepancy},
  author    = {Fisher, Matthew and Nolan, Tui and Graham, Matthew and Prangle, Dennis and Oates, Chris},
  booktitle = {Proceedings of The 24th International Conference on Artificial Intelligence and Statistics},
  pages     = {1054--1062},
  year      = {2021},
  editor    = {Banerjee, Arindam and Fukumizu, Kenji},
  volume    = {130},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--15 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v130/fisher21a/fisher21a.pdf},
  url       = {https://proceedings.mlr.press/v130/fisher21a.html},
  abstract  = {Measure transport underpins several recent algorithms for posterior approximation in the Bayesian context, wherein a transport map is sought to minimise the Kullback–Leibler divergence (KLD) from the posterior to the approximation. The KLD is a strong mode of convergence, requiring absolute continuity of measures and placing restrictions on which transport maps can be permitted. Here we propose to minimise a kernel Stein discrepancy (KSD) instead, which requires only that the set of transport maps is dense in an $L^2$ sense; we demonstrate how this condition can be validated. The consistency of the associated posterior approximation is established, and empirical results suggest that KSD is a competitive and more flexible alternative to KLD for measure transport.}
}
Endnote
%0 Conference Paper
%T Measure Transport with Kernel Stein Discrepancy
%A Matthew Fisher
%A Tui Nolan
%A Matthew Graham
%A Dennis Prangle
%A Chris Oates
%B Proceedings of The 24th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2021
%E Arindam Banerjee
%E Kenji Fukumizu
%F pmlr-v130-fisher21a
%I PMLR
%P 1054--1062
%U https://proceedings.mlr.press/v130/fisher21a.html
%V 130
%X Measure transport underpins several recent algorithms for posterior approximation in the Bayesian context, wherein a transport map is sought to minimise the Kullback–Leibler divergence (KLD) from the posterior to the approximation. The KLD is a strong mode of convergence, requiring absolute continuity of measures and placing restrictions on which transport maps can be permitted. Here we propose to minimise a kernel Stein discrepancy (KSD) instead, which requires only that the set of transport maps is dense in an $L^2$ sense; we demonstrate how this condition can be validated. The consistency of the associated posterior approximation is established, and empirical results suggest that KSD is a competitive and more flexible alternative to KLD for measure transport.
APA
Fisher, M., Nolan, T., Graham, M., Prangle, D. & Oates, C. (2021). Measure Transport with Kernel Stein Discrepancy. Proceedings of The 24th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 130:1054-1062. Available from https://proceedings.mlr.press/v130/fisher21a.html.
