Formal Limitations on the Measurement of Mutual Information
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:875-884, 2020.
Measuring mutual information from finite data is difficult. Recent work has considered variational methods maximizing a lower bound. In this paper, we prove that serious statistical limitations are inherent to any method of measuring mutual information. More specifically, we show that any distribution-free high-confidence lower bound on mutual information estimated from N samples cannot be larger than O(ln N).
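The O(ln N) ceiling can be seen concretely in a contrastive (InfoNCE-style) variational lower bound, where each per-sample term is algebraically capped at ln N regardless of how strong the true dependence is. A minimal sketch of that effect (the critic function and the noise scale here are illustrative choices, not constructions from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 128

# Tightly coupled pairs: the true mutual information is very large,
# yet the contrastive estimate computed from N samples cannot exceed ln N.
x = rng.normal(size=N)
y = x + 0.01 * rng.normal(size=N)

def critic(a, b, scale=0.01):
    # Hypothetical critic matched to the noise scale; scores[i, j] = f(x_i, y_j).
    return -(((a[:, None] - b[None, :]) / scale) ** 2)

scores = critic(x, y)

# InfoNCE-style estimate: mean over i of  s_ii - ln( mean_j exp(s_ij) ).
# Since exp(s_ii) <= sum_j exp(s_ij), every per-sample term is <= ln N,
# so the estimate saturates near ln N no matter how dependent x and y are.
estimate = np.mean(np.diag(scores) - np.log(np.mean(np.exp(scores), axis=1)))

print(f"estimate = {estimate:.3f}, ln N = {np.log(N):.3f}")
```

Even with near-deterministic dependence between x and y, the printed estimate stays below ln(128) ≈ 4.85; raising N only moves the ceiling logarithmically, which is the statistical limitation the abstract states.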