GAIT: A Geometric Approach to Information Theory

Jose Gallego Posada, Ankit Vani, Max Schwarzer, Simon Lacoste-Julien;
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2601-2611, 2020.

Abstract

We advocate the use of a notion of entropy that reflects the relative abundances of the symbols in an alphabet, as well as the similarities between them. This concept was originally introduced in theoretical ecology to study the diversity of ecosystems. Based on this notion of entropy, we introduce geometry-aware counterparts for several concepts and theorems in information theory. Notably, our proposed divergence exhibits performance on par with state-of-the-art methods based on the Wasserstein distance, but enjoys a closed-form expression that can be computed efficiently. We demonstrate the versatility of our method via experiments on a broad range of domains: training generative models, computing image barycenters, approximating empirical measures and counting modes.
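The entropy described above builds on the similarity-sensitive diversity of Leinster and Cobbold from theoretical ecology: instead of weighting each symbol only by its probability, it also accounts for how similar symbols are to one another via a similarity matrix. A minimal sketch of this quantity at order q = 1 follows; the function name and list-based representation are illustrative choices, not the paper's implementation.

```python
import math

def similarity_entropy(p, Z):
    """Similarity-sensitive entropy (Leinster-Cobbold, order q = 1):
    H(p) = -sum_i p_i * log((Z p)_i),
    where Z[i][j] in [0, 1] encodes the similarity between symbols i and j.
    With Z = identity, (Z p)_i = p_i and this reduces to Shannon entropy."""
    n = len(p)
    # (Z p)_i aggregates the probability mass of symbols similar to i.
    Zp = [sum(Z[i][j] * p[j] for j in range(n)) for i in range(n)]
    return -sum(p[i] * math.log(Zp[i]) for i in range(n) if p[i] > 0)

# Two equiprobable symbols, treated as fully distinct (identity similarity):
identity = [[1.0, 0.0], [0.0, 1.0]]
print(similarity_entropy([0.5, 0.5], identity))  # log(2), the Shannon entropy

# The same distribution, but the symbols are identical to each other:
all_similar = [[1.0, 1.0], [1.0, 1.0]]
print(similarity_entropy([0.5, 0.5], all_similar))  # 0.0: no effective diversity
```

The second example illustrates the geometric behavior the abstract highlights: when symbols are indistinguishable under the similarity matrix, the effective diversity collapses, which Shannon entropy alone cannot capture.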
