Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy

Xing Liu, Andrew B. Duncan, Axel Gandy
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:21527-21547, 2023.

Abstract

Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used in goodness-of-fit tests. It can be applied even when the target distribution has an unknown normalising factor, such as in Bayesian analysis. We show theoretically and empirically that the KSD test can suffer from low power when the target and the alternative distributions have the same well-separated modes but differ in mixing proportions. We propose to perturb the observed sample via Markov transition kernels, with respect to which the target distribution is invariant. This allows us to then employ the KSD test on the perturbed sample. We provide numerical evidence that with suitably chosen transition kernels the proposed approach can lead to substantially higher power than the KSD test.
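To make the pipeline in the abstract concrete, below is a minimal sketch (not the authors' implementation): a random-walk Metropolis-Hastings kernel, which leaves an illustrative target p invariant, is applied to the observed sample, and a V-statistic estimate of KSD is then computed on the perturbed points. The bimodal target, the random-walk proposal, and all tuning constants (n_steps, step_size, bandwidth) are assumptions made here for illustration; the power gains reported in the paper depend on suitably chosen transition kernels.

import numpy as np

def log_p(x):
    # Illustrative target: a balanced mixture of two well-separated
    # 1-d Gaussians, the setting where the plain KSD test loses power.
    return np.logaddexp(np.log(0.5) - 0.5 * (x - 4.0) ** 2,
                        np.log(0.5) - 0.5 * (x + 4.0) ** 2)

def score_p(x, eps=1e-5):
    # Score d/dx log p(x), here by central differences; KSD needs only
    # the score, so an unknown normalising factor would cancel.
    return (log_p(x + eps) - log_p(x - eps)) / (2.0 * eps)

def perturb(x, n_steps=10, step_size=1.0, seed=0):
    # Random-walk Metropolis-Hastings kernel: p-invariant, so a sample
    # drawn from p is still distributed as p after perturbation and the
    # level of the test is preserved under the null.
    rng = np.random.default_rng(seed)
    x = x.copy()
    for _ in range(n_steps):
        prop = x + step_size * rng.standard_normal(x.shape)
        accept = np.log(rng.uniform(size=x.shape)) < log_p(prop) - log_p(x)
        x = np.where(accept, prop, x)
    return x

def ksd_vstat(x, bandwidth=1.0):
    # V-statistic estimate of squared KSD with an RBF kernel (1-d case):
    # mean of h(x_i, x_j) = s_i s_j k + s_i dk/dy + s_j dk/dx + d2k/dxdy.
    s = score_p(x)
    d = x[:, None] - x[None, :]
    k = np.exp(-d ** 2 / (2.0 * bandwidth ** 2))
    dk_dx = -d / bandwidth ** 2 * k                              # dk/dx
    dk_dy = d / bandwidth ** 2 * k                               # dk/dy
    d2k = (1.0 / bandwidth ** 2 - d ** 2 / bandwidth ** 4) * k   # d2k/dxdy
    h = (s[:, None] * s[None, :] * k + s[:, None] * dk_dy
         + s[None, :] * dk_dx + d2k)
    return h.mean()

# Alternative with the same modes as p but unbalanced mixing proportions.
rng = np.random.default_rng(1)
n = 200
heavy = rng.uniform(size=n) < 0.9
x_obs = np.where(heavy, rng.normal(4.0, 1.0, n), rng.normal(-4.0, 1.0, n))
print("KSD^2 on observed sample :", ksd_vstat(x_obs))
print("KSD^2 on perturbed sample:", ksd_vstat(perturb(x_obs)))

In an actual test, the statistic on the perturbed sample would be compared against a null distribution obtained, for instance, by a wild bootstrap; that step is omitted here for brevity.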

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-liu23i,
  title     = {Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy},
  author    = {Liu, Xing and Duncan, Andrew B. and Gandy, Axel},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {21527--21547},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/liu23i/liu23i.pdf},
  url       = {https://proceedings.mlr.press/v202/liu23i.html},
  abstract  = {Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used in goodness-of-fit tests. It can be applied even when the target distribution has an unknown normalising factor, such as in Bayesian analysis. We show theoretically and empirically that the KSD test can suffer from low power when the target and the alternative distributions have the same well-separated modes but differ in mixing proportions. We propose to perturb the observed sample via Markov transition kernels, with respect to which the target distribution is invariant. This allows us to then employ the KSD test on the perturbed sample. We provide numerical evidence that with suitably chosen transition kernels the proposed approach can lead to substantially higher power than the KSD test.}
}
Endnote
%0 Conference Paper
%T Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy
%A Xing Liu
%A Andrew B. Duncan
%A Axel Gandy
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-liu23i
%I PMLR
%P 21527--21547
%U https://proceedings.mlr.press/v202/liu23i.html
%V 202
%X Kernelized Stein discrepancy (KSD) is a score-based discrepancy widely used in goodness-of-fit tests. It can be applied even when the target distribution has an unknown normalising factor, such as in Bayesian analysis. We show theoretically and empirically that the KSD test can suffer from low power when the target and the alternative distributions have the same well-separated modes but differ in mixing proportions. We propose to perturb the observed sample via Markov transition kernels, with respect to which the target distribution is invariant. This allows us to then employ the KSD test on the perturbed sample. We provide numerical evidence that with suitably chosen transition kernels the proposed approach can lead to substantially higher power than the KSD test.
APA
Liu, X., Duncan, A.B. & Gandy, A. (2023). Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:21527-21547. Available from https://proceedings.mlr.press/v202/liu23i.html.
