Nonparametric Score Estimators

Yuhao Zhou, Jiaxin Shi, Jun Zhu
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:11513-11522, 2020.

Abstract

Estimating the score, i.e., the gradient of the log density function, from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models that involve flexible yet intractable densities. Kernel estimators based on Stein's method or score matching have shown promise; however, their theoretical properties and relationships have not been fully understood. We provide a unifying view of these estimators under the framework of regularized nonparametric regression. This framework allows us to analyse existing estimators and to construct new ones with desirable properties by choosing different hypothesis spaces and regularizers. A unified convergence analysis is provided for such estimators. Finally, we propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-zhou20c,
  title     = {Nonparametric Score Estimators},
  author    = {Zhou, Yuhao and Shi, Jiaxin and Zhu, Jun},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {11513--11522},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/zhou20c/zhou20c.pdf},
  url       = {https://proceedings.mlr.press/v119/zhou20c.html}
}
Endnote
%0 Conference Paper
%T Nonparametric Score Estimators
%A Yuhao Zhou
%A Jiaxin Shi
%A Jun Zhu
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-zhou20c
%I PMLR
%P 11513--11522
%U https://proceedings.mlr.press/v119/zhou20c.html
%V 119
APA
Zhou, Y., Shi, J., & Zhu, J. (2020). Nonparametric Score Estimators. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research, 119:11513-11522. Available from https://proceedings.mlr.press/v119/zhou20c.html.