Sample Complexity of Probability Divergences under Group Symmetry

Ziyu Chen, Markos Katsoulakis, Luc Rey-Bellet, Wei Zhu
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:4713-4734, 2023.

Abstract

We rigorously quantify the improvement in the sample complexity of variational divergence estimations for group-invariant distributions. In the cases of the Wasserstein-1 metric and the Lipschitz-regularized $\alpha$-divergences, the reduction of sample complexity is proportional to an ambient-dimension-dependent power of the group size. For the maximum mean discrepancy (MMD), the improvement of sample complexity is more nuanced, as it depends on not only the group size but also the choice of kernel. Numerical simulations verify our theories.
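To make the claimed gain concrete, below is a minimal numerical sketch in the spirit of the abstract's MMD result: it compares the fluctuation of a plain Gaussian-kernel MMD estimator with one computed on group-averaged (symmetrized) empirical samples of a group-invariant distribution. The Z_4-invariant Gaussian-mixture target, the kernel bandwidth, and the symmetrization routine are illustrative assumptions, not the experimental setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Biased (V-statistic) estimate of the squared MMD between samples X and Y.
    return (gaussian_kernel(X, X, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean())

def sample_invariant(n):
    # A Z_4-invariant 2-D target: mixture of 4 Gaussians at 90-degree-rotated centers.
    angles = rng.integers(0, 4, n) * np.pi / 2
    centers = 2.0 * np.c_[np.cos(angles), np.sin(angles)]
    return centers + 0.3 * rng.standard_normal((n, 2))

def symmetrize(X):
    # Group-average the empirical measure: append all 4 rotated copies of each point.
    rots = [np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])
            for t in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]
    return np.vstack([X @ R.T for R in rots])

n, trials = 200, 50
plain, sym = [], []
for _ in range(trials):
    X, Y = sample_invariant(n), sample_invariant(n)
    # Both samples come from the same law, so MMD^2 here reflects pure estimation error.
    plain.append(mmd2(X, Y))
    sym.append(mmd2(symmetrize(X), symmetrize(Y)))

print(f"mean MMD^2, raw samples:         {np.mean(plain):.4e}")
print(f"mean MMD^2, symmetrized samples: {np.mean(sym):.4e}")
```

In this sketch the symmetrized estimates are typically smaller, since group-averaging pushes each empirical measure closer to the (invariant) target; the magnitude of the improvement, as the abstract notes, depends on the kernel as well as the group size.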

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-chen23p,
  title     = {Sample Complexity of Probability Divergences under Group Symmetry},
  author    = {Chen, Ziyu and Katsoulakis, Markos and Rey-Bellet, Luc and Zhu, Wei},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {4713--4734},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/chen23p/chen23p.pdf},
  url       = {https://proceedings.mlr.press/v202/chen23p.html},
  abstract  = {We rigorously quantify the improvement in the sample complexity of variational divergence estimations for group-invariant distributions. In the cases of the Wasserstein-1 metric and the Lipschitz-regularized $\alpha$-divergences, the reduction of sample complexity is proportional to an ambient-dimension-dependent power of the group size. For the maximum mean discrepancy (MMD), the improvement of sample complexity is more nuanced, as it depends on not only the group size but also the choice of kernel. Numerical simulations verify our theories.}
}
Endnote
%0 Conference Paper
%T Sample Complexity of Probability Divergences under Group Symmetry
%A Ziyu Chen
%A Markos Katsoulakis
%A Luc Rey-Bellet
%A Wei Zhu
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-chen23p
%I PMLR
%P 4713--4734
%U https://proceedings.mlr.press/v202/chen23p.html
%V 202
%X We rigorously quantify the improvement in the sample complexity of variational divergence estimations for group-invariant distributions. In the cases of the Wasserstein-1 metric and the Lipschitz-regularized $\alpha$-divergences, the reduction of sample complexity is proportional to an ambient-dimension-dependent power of the group size. For the maximum mean discrepancy (MMD), the improvement of sample complexity is more nuanced, as it depends on not only the group size but also the choice of kernel. Numerical simulations verify our theories.
APA
Chen, Z., Katsoulakis, M., Rey-Bellet, L. & Zhu, W. (2023). Sample Complexity of Probability Divergences under Group Symmetry. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:4713-4734. Available from https://proceedings.mlr.press/v202/chen23p.html.
