GAIT: A Geometric Approach to Information Theory

Jose Gallego Posada, Ankit Vani, Max Schwarzer, Simon Lacoste-Julien
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:2601-2611, 2020.

Abstract

We advocate the use of a notion of entropy that reflects the relative abundances of the symbols in an alphabet, as well as the similarities between them. This concept was originally introduced in theoretical ecology to study the diversity of ecosystems. Based on this notion of entropy, we introduce geometry-aware counterparts for several concepts and theorems in information theory. Notably, our proposed divergence exhibits performance on par with state-of-the-art methods based on the Wasserstein distance, but enjoys a closed-form expression that can be computed efficiently. We demonstrate the versatility of our method via experiments on a broad range of domains: training generative models, computing image barycenters, approximating empirical measures and counting modes.
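The entropy the abstract refers to builds on the similarity-sensitive diversity index introduced in theoretical ecology (Leinster & Cobbold). As a rough illustration, not the paper's exact parameterization, the order-1 version of that entropy weights each symbol's probability by its expected similarity to the rest of the alphabet via a similarity matrix `Z`; with `Z` equal to the identity (all symbols fully distinct) it reduces to ordinary Shannon entropy:

```python
import numpy as np

def similarity_sensitive_entropy(p, Z):
    """Order-1 similarity-sensitive entropy of distribution p
    under similarity matrix Z (Leinster-Cobbold style).

    (Zp)_i is the expected similarity of a symbol drawn from p
    to symbol i; with Z = I this is plain Shannon entropy.
    """
    p = np.asarray(p, dtype=float)
    Zp = Z @ p
    return -np.sum(p * np.log(Zp))

p = np.array([0.5, 0.3, 0.2])

# Identity similarity: recovers Shannon entropy of p.
H_shannon = similarity_sensitive_entropy(p, np.eye(3))

# All-ones similarity: every symbol looks identical, so the
# effective diversity collapses and the entropy is zero.
H_degenerate = similarity_sensitive_entropy(p, np.ones((3, 3)))
```

The function name and variable names here are illustrative; the paper itself develops a full family of geometry-aware information-theoretic quantities on top of this idea.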

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-posada20a,
  title     = {{GAIT}: A Geometric Approach to Information Theory},
  author    = {Posada, Jose Gallego and Vani, Ankit and Schwarzer, Max and Lacoste-Julien, Simon},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {2601--2611},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/posada20a/posada20a.pdf},
  url       = {http://proceedings.mlr.press/v108/posada20a.html},
  abstract  = {We advocate the use of a notion of entropy that reflects the relative abundances of the symbols in an alphabet, as well as the similarities between them. This concept was originally introduced in theoretical ecology to study the diversity of ecosystems. Based on this notion of entropy, we introduce geometry-aware counterparts for several concepts and theorems in information theory. Notably, our proposed divergence exhibits performance on par with state-of-the-art methods based on the Wasserstein distance, but enjoys a closed-form expression that can be computed efficiently. We demonstrate the versatility of our method via experiments on a broad range of domains: training generative models, computing image barycenters, approximating empirical measures and counting modes.}
}
Endnote
%0 Conference Paper
%T GAIT: A Geometric Approach to Information Theory
%A Jose Gallego Posada
%A Ankit Vani
%A Max Schwarzer
%A Simon Lacoste-Julien
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-posada20a
%I PMLR
%P 2601--2611
%U http://proceedings.mlr.press/v108/posada20a.html
%V 108
%X We advocate the use of a notion of entropy that reflects the relative abundances of the symbols in an alphabet, as well as the similarities between them. This concept was originally introduced in theoretical ecology to study the diversity of ecosystems. Based on this notion of entropy, we introduce geometry-aware counterparts for several concepts and theorems in information theory. Notably, our proposed divergence exhibits performance on par with state-of-the-art methods based on the Wasserstein distance, but enjoys a closed-form expression that can be computed efficiently. We demonstrate the versatility of our method via experiments on a broad range of domains: training generative models, computing image barycenters, approximating empirical measures and counting modes.
APA
Posada, J.G., Vani, A., Schwarzer, M. & Lacoste-Julien, S. (2020). GAIT: A Geometric Approach to Information Theory. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 108:2601-2611. Available from http://proceedings.mlr.press/v108/posada20a.html.