Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians
Proceedings of The 33rd International Conference on Algorithmic Learning Theory, PMLR 167:319-341, 2022.
Mixtures of high-dimensional Gaussian distributions have been studied extensively in statistics and learning theory. While the total variation distance arises naturally in the sample complexity of distribution learning, tight lower bounds on it are analytically difficult to obtain for mixtures. Exploiting a connection between the total variation distance and the characteristic function of the mixture, we provide fairly tight functional approximations. These enable us to derive new lower bounds on the total variation distance between two-component Gaussian mixtures with a shared covariance matrix.
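As background for the quantity the paper bounds: the total variation distance between two densities p and q is TV(P, Q) = (1/2) ∫ |p(x) − q(x)| dx, and for low-dimensional mixtures it can be estimated numerically. The sketch below is purely illustrative and is not the paper's method; it computes TV between two one-dimensional two-component Gaussian mixtures with a shared standard deviation, using a simple Riemann sum (function names, grid, and parameters are assumptions for the example).

```python
import numpy as np

def mixture_pdf(x, weights, means, sigma):
    # Density of a Gaussian mixture with shared standard deviation sigma:
    # sum_i weights[i] * N(x; means[i], sigma^2), evaluated on a grid x.
    x = np.asarray(x)[:, None]           # shape (N, 1)
    means = np.asarray(means)            # shape (k,)
    comps = np.exp(-0.5 * ((x - means) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return comps @ np.asarray(weights)   # shape (N,)

def tv_distance(w1, mu1, w2, mu2, sigma, lo=-25.0, hi=25.0, n=100_001):
    # TV(P, Q) = 0.5 * integral |p - q| dx, approximated on a uniform grid.
    grid = np.linspace(lo, hi, n)
    dx = grid[1] - grid[0]
    p = mixture_pdf(grid, w1, mu1, sigma)
    q = mixture_pdf(grid, w2, mu2, sigma)
    return 0.5 * float(np.sum(np.abs(p - q)) * dx)

# Identical mixtures give TV = 0; well-separated mixtures give TV close to 1.
print(tv_distance([0.5, 0.5], [0.0, 1.0], [0.5, 0.5], [0.0, 1.0], 1.0))
print(tv_distance([0.5, 0.5], [-10.0, -8.0], [0.5, 0.5], [8.0, 10.0], 1.0))
```

The grid endpoints must comfortably cover both mixtures' tails; otherwise the integral is truncated and the estimate is biased downward.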