Fast Variational Estimation of Mutual Information for Implicit and Explicit Likelihood Models

Caleb Dahlke, Sue Zheng, Jason Pacheco
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:10262-10278, 2023.

Abstract

Mutual information (MI) between random variables lacks a closed form in nontrivial models. Variational MI approximations are widely used as flexible estimators for this purpose, but computing them typically requires solving a costly nonconvex optimization. We prove that a widely used class of variational MI estimators can be solved via moment-matching operations in place of the numerical optimization methods that are typically required. We show that the same moment-matching solution yields variational estimates for so-called "implicit" models that lack a closed-form likelihood function. Furthermore, we demonstrate that this moment-matching solution achieves a computational speedup of multiple orders of magnitude over the standard optimization-based solutions. Our theoretical results are supported by numerical evaluation in fully parameterized Gaussian mixture models and in a generalized linear model whose likelihood is implicit due to nuisance variables. We also demonstrate our approach on an SIR epidemiology model with an implicit, simulation-based likelihood, where we avoid costly likelihood-free inference and observe a speedup of many orders of magnitude.
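To make the moment-matching idea concrete, the sketch below estimates MI under a Gaussian variational family purely from sample moments: fitting the Gaussian is a closed-form operation, so no iterative optimization is needed, and only joint samples of (X, Y) are consumed, which is what makes such estimators applicable to implicit models. This is a minimal illustrative sketch, not the authors' exact estimator; the function name and the linear-Gaussian test model are assumptions for the example.

import numpy as np

def gaussian_mi_moment_matched(x, y):
    """Estimate I(X; Y) by moment-matching a joint Gaussian to samples.

    x: (n, dx) samples of X; y: (n, dy) samples of Y.
    The fit is just the sample covariance (closed form, no optimization);
    the MI of the fitted Gaussian is 0.5 * (log|S_xx| + log|S_yy| - log|S|).
    """
    dx = x.shape[1]
    S = np.cov(np.hstack([x, y]), rowvar=False)  # moment matching step
    _, logdet_joint = np.linalg.slogdet(S)
    _, logdet_x = np.linalg.slogdet(S[:dx, :dx])
    _, logdet_y = np.linalg.slogdet(S[dx:, dx:])
    return 0.5 * (logdet_x + logdet_y - logdet_joint)

# Linear-Gaussian example, where the Gaussian family is exact:
rng = np.random.default_rng(0)
x = rng.normal(size=(50_000, 2))
y = x @ np.array([[1.0, 0.5], [0.0, 1.0]]) + 0.5 * rng.normal(size=(50_000, 2))
print(gaussian_mi_moment_matched(x, y))  # close to the true Gaussian MI

Note that only forward samples of the model are used; for an implicit simulator such as the SIR model, the same routine would apply to simulated (parameter, observation) pairs.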

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-dahlke23a,
  title     = {Fast Variational Estimation of Mutual Information for Implicit and Explicit Likelihood Models},
  author    = {Dahlke, Caleb and Zheng, Sue and Pacheco, Jason},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {10262--10278},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/dahlke23a/dahlke23a.pdf},
  url       = {https://proceedings.mlr.press/v206/dahlke23a.html}
}
Endnote
%0 Conference Paper
%T Fast Variational Estimation of Mutual Information for Implicit and Explicit Likelihood Models
%A Caleb Dahlke
%A Sue Zheng
%A Jason Pacheco
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-dahlke23a
%I PMLR
%P 10262--10278
%U https://proceedings.mlr.press/v206/dahlke23a.html
%V 206
APA
Dahlke, C., Zheng, S., & Pacheco, J. (2023). Fast Variational Estimation of Mutual Information for Implicit and Explicit Likelihood Models. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:10262-10278. Available from https://proceedings.mlr.press/v206/dahlke23a.html.
