Bregman Deviations of Generic Exponential Families
Proceedings of Thirty Sixth Conference on Learning Theory, PMLR 195:394-449, 2023.
Abstract
We revisit the method of mixtures, or Laplace method, to study the concentration phenomenon in generic (possibly multidimensional) exponential families. Using the duality properties of the Bregman divergence associated with the log-partition function of the family to construct nonnegative martingales, we establish a generic bound controlling the deviation between the parameter of the family and a finite-sample estimate, expressed in the local geometry induced by the Bregman pseudo-metric. Our bound is time-uniform and involves a quantity extending the classical information gain to exponential families, which we call the Bregman information gain. For the practitioner, we instantiate this novel bound for several classical families, e.g., Gaussian (including the unknown-variance and multivariate cases), Bernoulli, Exponential, Weibull, Pareto, Poisson, and Chi-square, yielding explicit forms of the confidence sets and of the Bregman information gain. We further compare the resulting confidence bounds to state-of-the-art time-uniform alternatives and show that this novel method yields competitive results. Finally, we apply our result to the design of generalized likelihood ratio tests for change detection, capturing new settings such as variance changes in Gaussian families.
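To fix ideas, the Bregman divergence mentioned above has a standard closed form; the following is a minimal sketch in our own notation (the paper's exact normalization may differ), writing \(\mathcal{L}\) for the log-partition function of a family with natural parameter \(\theta\):

\[
\mathcal{B}_{\mathcal{L}}(\theta, \theta') \;=\; \mathcal{L}(\theta) - \mathcal{L}(\theta') - \big\langle \nabla \mathcal{L}(\theta'),\, \theta - \theta' \big\rangle .
\]

For instance, for a Gaussian family with known variance \(\sigma^2\), one has \(\mathcal{L}(\theta) = \sigma^2 \theta^2 / 2\), so the divergence reduces to the quadratic \(\sigma^2 (\theta - \theta')^2 / 2\) and a Bregman deviation bound recovers a familiar quadratic confidence region.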
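As a concrete illustration of the kind of time-uniform confidence set the method of mixtures produces, the sketch below implements the classical Gaussian-mixture deviation radius for the empirical mean of sub-Gaussian observations. This is an illustrative stand-in, not the paper's exact bound or its Bregman information gain; the function name gaussian_mixture_radius and the mixture parameter c are our own choices.

    import numpy as np

    def gaussian_mixture_radius(n, sigma=1.0, delta=0.05, c=1.0):
        """Time-uniform half-width for the empirical mean of n i.i.d.
        sigma-sub-Gaussian observations, obtained from the classical
        method of mixtures with a Gaussian mixing distribution. With
        probability at least 1 - delta, simultaneously for all n >= 1:
            |mean_n - mu| < gaussian_mixture_radius(n).
        The mixture parameter c (a prior precision) is a free tuning
        knob; an illustrative instance, not the paper's exact bound.
        """
        n = np.asarray(n, dtype=float)
        log_term = np.log(np.sqrt((n + c) / c) / delta)
        return sigma * np.sqrt(2.0 * (n + c) * log_term) / n

    # The radius shrinks like sqrt(log(n)/n): the mild price of
    # uniformity over all sample sizes, compared to a fixed-n quantile.
    for n in (10, 100, 1000, 10000):
        print(n, float(gaussian_mixture_radius(n)))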
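The change-detection application can likewise be illustrated with a generalized likelihood ratio statistic. The sketch below computes the textbook GLR for a variance change in a zero-mean Gaussian stream by scanning over candidate change points; the paper's calibrated threshold (obtained from the Bregman deviation bound) is not reproduced here, so any stopping rule built on this statistic would need a separately calibrated threshold.

    import numpy as np

    def glr_variance_change(x, min_seg=2):
        """GLR statistic for a variance change in a zero-mean Gaussian
        sequence x[0..n-1]. Scans every admissible split s and compares
        the two-segment Gaussian log-likelihood (at maximum-likelihood
        variances) against the single-segment one. Returns the maximal
        GLR value and the maximizing split index."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sq = np.cumsum(x ** 2)          # prefix sums of squares
        var_full = sq[-1] / n           # MLE variance, whole sequence
        best, best_s = -np.inf, None
        for s in range(min_seg, n - min_seg + 1):
            var_left = sq[s - 1] / s
            var_right = (sq[-1] - sq[s - 1]) / (n - s)
            # Log-likelihood gain of the two-segment model over the
            # one-segment model, both at their MLE variances.
            glr = 0.5 * (n * np.log(var_full)
                         - s * np.log(var_left)
                         - (n - s) * np.log(var_right))
            if glr > best:
                best, best_s = glr, s
        return best, best_s

    # Toy usage: the variance doubles halfway through the stream.
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0, 1.0, 200), rng.normal(0, 2.0, 200)])
    stat, split = glr_variance_change(x)
    print(f"GLR = {stat:.1f} at split {split}")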