Dimension-free Information Concentration via Exp-Concavity
Proceedings of Algorithmic Learning Theory, PMLR 83:451-469, 2018.
Abstract
Information concentration of probability measures has important implications in learning theory. Recently, it was discovered that the information content of a log-concave distribution concentrates around its differential entropy, albeit with an unpleasant dependence on the ambient dimension. In this work, we prove that if the potential of the log-concave distribution is exp-concave, a central notion for fast rates in online and statistical learning, then the concentration of information can be further improved to depend only on the exp-concavity parameter, and hence can be dimension-independent. Central to our proof is a novel yet simple application of the variance Brascamp-Lieb inequality. In the context of learning theory, concentration of information immediately yields high-probability versions of many previous bounds that only hold in expectation.
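For orientation, the following sketch records the standard definitions behind the abstract's terminology; the notation below is ours and is not taken from the paper itself:

```latex
% A potential $V:\mathbb{R}^n \to \mathbb{R}$ is $\alpha$-exp-concave if
\[
  x \mapsto e^{-\alpha V(x)} \ \text{is concave for some } \alpha > 0.
\]
% For a log-concave measure $\mu$ with density $e^{-V}$ and $X \sim \mu$,
% the information content is $V(X) = -\log \tfrac{d\mu}{dx}(X)$, whose mean
% is the differential entropy:
\[
  h(\mu) = \mathbb{E}_{X \sim \mu}\!\left[ V(X) \right].
\]
% Concentration of information refers to bounds on the deviation
\[
  \Pr\big( \left| V(X) - h(\mu) \right| \ge t \big),
\]
% which the paper shows can be controlled in terms of $\alpha$ alone,
% independently of the dimension $n$.
```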