A Kernelized Stein Discrepancy for Goodness-of-fit Tests

Qiang Liu, Jason Lee, Michael Jordan
Proceedings of The 33rd International Conference on Machine Learning, PMLR 48:276-284, 2016.

Abstract

We derive a new discrepancy statistic for measuring differences between two probability distributions by combining Stein’s identity with reproducing kernel Hilbert space theory. We apply our result to test how well a probabilistic model fits a set of observations, and derive a new class of powerful goodness-of-fit tests that are widely applicable to complex and high-dimensional distributions, even those with computationally intractable normalization constants. Both theoretical and empirical properties of our methods are studied thoroughly.
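The statistic can be estimated directly from samples whenever the model's score function ∇x log p(x) is available, which is what makes it usable for unnormalized models. The Python/NumPy sketch below is an illustrative U-statistic estimate of a kernelized Stein discrepancy with an RBF kernel, not the authors' released code; the name ksd_u_statistic, the median-distance bandwidth heuristic, and the standard-normal example are assumptions made here for illustration.

    # Minimal sketch of a kernelized Stein discrepancy (KSD) U-statistic estimate.
    # Illustrative only: function name, bandwidth heuristic, and example are assumptions.
    import numpy as np

    def ksd_u_statistic(X, score_p, h=None):
        """U-statistic estimate of the KSD between sample X (n x d) and a model p,
        given only its score function score_p(x) = grad_x log p(x); the
        normalization constant of p is never needed."""
        n, d = X.shape
        S = np.apply_along_axis(score_p, 1, X)            # n x d matrix of model scores
        sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        if h is None:                                     # median-distance bandwidth heuristic
            h = np.sqrt(0.5 * np.median(sq_dists[sq_dists > 0]))
        K = np.exp(-sq_dists / (2 * h ** 2))              # RBF kernel matrix
        diff = X[:, None, :] - X[None, :, :]              # pairwise x_i - x_j, shape n x n x d
        # Stein kernel u_p(x_i, x_j) assembled term by term
        term1 = (S @ S.T) * K                                       # s_p(x_i)^T s_p(x_j) k
        term2 = np.einsum('id,ijd->ij', S, diff / h ** 2) * K       # s_p(x_i)^T grad_{x_j} k
        term3 = np.einsum('jd,ijd->ij', S, -diff / h ** 2) * K      # grad_{x_i} k^T s_p(x_j)
        term4 = (d / h ** 2 - sq_dists / h ** 4) * K                # trace of grad_{x_i} grad_{x_j} k
        U = term1 + term2 + term3 + term4
        np.fill_diagonal(U, 0.0)                          # drop i = j terms for the U-statistic
        return U.sum() / (n * (n - 1))

    # Example: samples from N(0, I) tested against the standard normal model (score = -x).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 2))
    print(ksd_u_statistic(X, score_p=lambda x: -x))       # close to 0 when the model fits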

Cite this Paper


BibTeX
@InProceedings{pmlr-v48-liub16,
  title     = {A Kernelized Stein Discrepancy for Goodness-of-fit Tests},
  author    = {Liu, Qiang and Lee, Jason and Jordan, Michael},
  booktitle = {Proceedings of The 33rd International Conference on Machine Learning},
  pages     = {276--284},
  year      = {2016},
  editor    = {Balcan, Maria Florina and Weinberger, Kilian Q.},
  volume    = {48},
  series    = {Proceedings of Machine Learning Research},
  address   = {New York, New York, USA},
  month     = {20--22 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v48/liub16.pdf},
  url       = {https://proceedings.mlr.press/v48/liub16.html}
}
APA
Liu, Q., Lee, J., & Jordan, M. (2016). A Kernelized Stein Discrepancy for Goodness-of-fit Tests. Proceedings of The 33rd International Conference on Machine Learning, in Proceedings of Machine Learning Research 48:276-284. Available from https://proceedings.mlr.press/v48/liub16.html.
