A Kernelized Stein Discrepancy for Goodness-of-fit Tests


Qiang Liu, Jason Lee, Michael Jordan;
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:276-284, 2016.

Abstract

We derive a new discrepancy statistic for measuring differences between two probability distributions by combining Stein’s identity with reproducing kernel Hilbert space (RKHS) theory. We apply our result to test how well a probabilistic model fits a set of observations, and derive a new class of powerful goodness-of-fit tests that are widely applicable to complex, high-dimensional distributions, even those with computationally intractable normalization constants. Both theoretical and empirical properties of our methods are studied thoroughly.
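To make the idea concrete, below is a minimal sketch of the kernelized Stein discrepancy estimated as a U-statistic with an RBF kernel, requiring only the model's score function ∇_x log p(x) (so the normalization constant never appears). The function name, bandwidth choice, and toy Gaussian example are illustrative assumptions, not the paper's released code.

```python
import numpy as np

def ksd_ustat(X, score_p, h=1.0):
    """U-statistic estimate of the squared KSD between samples X (n x d) and a
    model p specified only through its score function score_p(x) = grad_x log p(x),
    using an RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2))."""
    n, d = X.shape
    S = score_p(X)                        # (n, d) model score at each sample
    diff = X[:, None, :] - X[None, :, :]  # (n, n, d) pairwise differences x_i - x_j
    sqd = np.sum(diff ** 2, axis=-1)      # squared pairwise distances
    K = np.exp(-sqd / (2 * h ** 2))       # RBF kernel matrix

    # Stein kernel: u_p(x, x') = s(x)^T k s(x') + s(x)^T grad_{x'} k
    #               + grad_x k^T s(x') + trace(grad_x grad_{x'} k)
    term1 = (S @ S.T) * K
    term2 = np.einsum('id,ijd->ij', S, diff) / h ** 2 * K    # s(x_i)^T grad_{x_j} k
    term3 = -np.einsum('jd,ijd->ij', S, diff) / h ** 2 * K   # grad_{x_i} k^T s(x_j)
    term4 = (d / h ** 2 - sqd / h ** 4) * K                  # trace term for the RBF kernel
    U = term1 + term2 + term3 + term4

    np.fill_diagonal(U, 0.0)              # U-statistic excludes the i == j terms
    return U.sum() / (n * (n - 1))

# Toy check (assumed example): samples from N(0, I) against a standard-normal
# model, whose score is simply -x.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 2))
print(ksd_ustat(X, score_p=lambda x: -x))  # should be close to zero
```

In a goodness-of-fit test, the same statistic would be compared against a null distribution (e.g., obtained by bootstrapping the U-statistic) to decide whether the sample is consistent with the model.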
