Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians

Sami Davies, Arya Mazumdar, Soumyabrata Pal, Cyrus Rashtchian
Proceedings of The 33rd International Conference on Algorithmic Learning Theory, PMLR 167:319-341, 2022.

Abstract

Mixtures of high dimensional Gaussian distributions have been studied extensively in statistics and learning theory. While the total variation distance appears naturally in the sample complexity of distribution learning, it is analytically difficult to obtain tight lower bounds for mixtures. Exploiting a connection between total variation distance and the characteristic function of the mixture, we provide fairly tight functional approximations. This enables us to derive new lower bounds on the total variation distance between two-component Gaussian mixtures with a shared covariance matrix.

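As a rough illustration of the quantities named in the abstract (not the paper's actual bounds), the sketch below numerically estimates the total variation distance between two two-component one-dimensional Gaussian mixtures with a shared variance, and compares it against the elementary characteristic-function lower bound TV(f, g) >= sup_t |phi_f(t) - phi_g(t)| / 2, which follows from |e^{itx}| <= 1. All parameter values are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Two-component 1-D Gaussian mixtures with a shared standard deviation sigma.
# All parameter values below are illustrative, not taken from the paper.
sigma = 1.0
w, a = (0.5, 0.5), (-1.0, 1.0)   # mixture f: weights and component means
v, b = (0.3, 0.7), (-0.5, 1.5)   # mixture g: weights and component means

def mix_pdf(x, weights, means):
    """Density of a Gaussian mixture with shared scale sigma."""
    return sum(wi * norm.pdf(x, loc=mi, scale=sigma) for wi, mi in zip(weights, means))

def mix_cf(t, weights, means):
    """Characteristic function: weighted sum of exp(i*t*mu - sigma^2 t^2 / 2)."""
    return sum(wi * np.exp(1j * t * mi - 0.5 * (sigma * t) ** 2)
               for wi, mi in zip(weights, means))

# TV(f, g) = (1/2) * integral of |f(x) - g(x)| dx, estimated on a fine grid.
xs = np.linspace(-12.0, 12.0, 200_001)
dx = xs[1] - xs[0]
tv = 0.5 * np.sum(np.abs(mix_pdf(xs, w, a) - mix_pdf(xs, v, b))) * dx

# Elementary lower bound: since |e^{itx}| <= 1, for every t we have
# |phi_f(t) - phi_g(t)| <= integral |f - g| = 2 * TV(f, g).
ts = np.linspace(-8.0, 8.0, 4001)
cf_bound = 0.5 * np.abs(mix_cf(ts, w, a) - mix_cf(ts, v, b)).max()

print(f"estimated TV distance:            {tv:.4f}")
print(f"characteristic-function lower bd: {cf_bound:.4f}")  # always <= tv
```

This elementary characteristic-function bound is loose in general; the paper's contribution is sharper, parameter-dependent lower bounds for high-dimensional mixtures with a shared covariance matrix.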
Cite this Paper

BibTeX
@InProceedings{pmlr-v167-davies22a,
  title     = {Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians},
  author    = {Davies, Sami and Mazumdar, Arya and Pal, Soumyabrata and Rashtchian, Cyrus},
  booktitle = {Proceedings of The 33rd International Conference on Algorithmic Learning Theory},
  pages     = {319--341},
  year      = {2022},
  editor    = {Dasgupta, Sanjoy and Haghtalab, Nika},
  volume    = {167},
  series    = {Proceedings of Machine Learning Research},
  month     = {29 Mar--01 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v167/davies22a/davies22a.pdf},
  url       = {https://proceedings.mlr.press/v167/davies22a.html},
  abstract  = {Mixtures of high dimensional Gaussian distributions have been studied extensively in statistics and learning theory. While the total variation distance appears naturally in the sample complexity of distribution learning, it is analytically difficult to obtain tight lower bounds for mixtures. Exploiting a connection between total variation distance and the characteristic function of the mixture, we provide fairly tight functional approximations. This enables us to derive new lower bounds on the total variation distance between two-component Gaussian mixtures with a shared covariance matrix.}
}
Endnote
%0 Conference Paper
%T Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians
%A Sami Davies
%A Arya Mazumdar
%A Soumyabrata Pal
%A Cyrus Rashtchian
%B Proceedings of The 33rd International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2022
%E Sanjoy Dasgupta
%E Nika Haghtalab
%F pmlr-v167-davies22a
%I PMLR
%P 319--341
%U https://proceedings.mlr.press/v167/davies22a.html
%V 167
%X Mixtures of high dimensional Gaussian distributions have been studied extensively in statistics and learning theory. While the total variation distance appears naturally in the sample complexity of distribution learning, it is analytically difficult to obtain tight lower bounds for mixtures. Exploiting a connection between total variation distance and the characteristic function of the mixture, we provide fairly tight functional approximations. This enables us to derive new lower bounds on the total variation distance between two-component Gaussian mixtures with a shared covariance matrix.
APA
Davies, S., Mazumdar, A., Pal, S., & Rashtchian, C. (2022). Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians. Proceedings of The 33rd International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 167:319-341. Available from https://proceedings.mlr.press/v167/davies22a.html.