Formal Limitations on the Measurement of Mutual Information

David McAllester, Karl Stratos
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:875-884, 2020.

Abstract

Measuring mutual information from finite data is difficult. Recent work has considered variational methods maximizing a lower bound. In this paper, we prove that serious statistical limitations are inherent to any method of measuring mutual information. More specifically, we show that any distribution-free high-confidence lower bound on mutual information estimated from N samples cannot be larger than O(ln N).
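The variational lower bounds the abstract refers to include contrastive estimators such as InfoNCE. As a minimal sketch, not taken from the paper, the following Python snippet illustrates the flavor of the limitation: because the positive pair appears inside its own normalizing sum, an InfoNCE-style estimate computed from N samples is algebraically capped at ln N, no matter how large the true mutual information is. The synthetic identity-coupled distribution and the fixed matching critic here are illustrative assumptions, standing in for a learned critic.

```python
import numpy as np

rng = np.random.default_rng(0)

# X uniform on a large alphabet, Y = X, so the true MI is ln K (about 13.8 nats).
N, K = 1000, 10**6
x = rng.integers(0, K, size=N)
y = x.copy()

# A fixed critic that scores matching pairs highly stands in for a learned one.
t = 20.0
scores = t * (x[:, None] == y[None, :]).astype(float)  # scores[i, j] = f(x_i, y_j)

# InfoNCE-style lower-bound estimate on the N samples:
#   I_hat = (1/N) sum_i [ f(x_i, y_i) - ln((1/N) sum_j exp f(x_i, y_j)) ]
log_mean_exp = np.log(np.mean(np.exp(scores), axis=1))
i_hat = np.mean(np.diag(scores) - log_mean_exp)

# Since the j = i term appears in the inner sum, each summand is at most ln N,
# so the estimate can never exceed ln N regardless of the true MI.
print(f"true MI = {np.log(K):.2f} nats, estimate = {i_hat:.2f}, ln N = {np.log(N):.2f}")
```

Running this prints an estimate of roughly ln 1000 ≈ 6.91 nats against a true MI of about 13.82 nats; the paper's contribution is to prove that this kind of cap holds for any distribution-free high-confidence lower bound, not just this particular estimator.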

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-mcallester20a,
  title     = {Formal Limitations on the Measurement of Mutual Information},
  author    = {McAllester, David and Stratos, Karl},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {875--884},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/mcallester20a/mcallester20a.pdf},
  url       = {https://proceedings.mlr.press/v108/mcallester20a.html},
  abstract  = {Measuring mutual information from finite data is difficult. Recent work has considered variational methods maximizing a lower bound. In this paper, we prove that serious statistical limitations are inherent to any method of measuring mutual information. More specifically, we show that any distribution-free high-confidence lower bound on mutual information estimated from N samples cannot be larger than O(ln N).}
}
Endnote
%0 Conference Paper
%T Formal Limitations on the Measurement of Mutual Information
%A David McAllester
%A Karl Stratos
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-mcallester20a
%I PMLR
%P 875--884
%U https://proceedings.mlr.press/v108/mcallester20a.html
%V 108
%X Measuring mutual information from finite data is difficult. Recent work has considered variational methods maximizing a lower bound. In this paper, we prove that serious statistical limitations are inherent to any method of measuring mutual information. More specifically, we show that any distribution-free high-confidence lower bound on mutual information estimated from N samples cannot be larger than O(ln N).
APA
McAllester, D. & Stratos, K. (2020). Formal Limitations on the Measurement of Mutual Information. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:875-884. Available from https://proceedings.mlr.press/v108/mcallester20a.html.
