Learning deep kernels for exponential family densities

Li Wenliang, Danica J. Sutherland, Heiko Strathmann, Arthur Gretton
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:6737-6746, 2019.

Abstract

The kernel exponential family is a rich class of distributions, which can be fit efficiently and with statistical guarantees by score matching. Being required to choose a priori a simple kernel such as the Gaussian, however, limits its practical applicability. We provide a scheme for learning a kernel parameterized by a deep network, which can find complex location-dependent local features of the data geometry. This gives a very rich class of density models, capable of fitting complex structures on moderate-dimensional problems. Compared to deep density models fit via maximum likelihood, our approach provides a complementary set of strengths and tradeoffs: in empirical studies, the former can yield higher likelihoods, whereas the latter gives better estimates of the gradient of the log density, the score, which describes the distribution’s shape.
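For context, a brief sketch of the objects the abstract refers to. The kernel exponential family and score-matching definitions below are standard; the deep-kernel expression is an illustrative form with assumed symbols (\phi_\theta, \sigma), not necessarily the exact parameterization used in the paper.

In the kernel exponential family, the unnormalized log density is a function f in the RKHS \mathcal{H}_k of a kernel k, relative to a base measure q_0:

  p_f(x) \;\propto\; q_0(x)\,\exp\!\big(f(x)\big), \qquad f \in \mathcal{H}_k .

Score matching (Hyvärinen, 2005) fits f without computing the normalizer, by matching the model score to the data score:

  J(f) \;=\; \tfrac{1}{2}\,\mathbb{E}_{x \sim p_0}\!\left[\, \big\| \nabla_x \log p_f(x) - \nabla_x \log p_0(x) \big\|^2 \right],

which, after integration by parts, reduces to an expectation of first and second derivatives of \log p_f alone. A "deep kernel" replaces a fixed Gaussian kernel with one evaluated on learned features, e.g.

  k_\theta(x, y) \;=\; \exp\!\Big( -\tfrac{\| \phi_\theta(x) - \phi_\theta(y) \|^2}{2\sigma^2} \Big),

where \phi_\theta is a deep network trained through the score-matching objective.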

Cite this Paper

BibTeX
@InProceedings{pmlr-v97-wenliang19a,
  title     = {Learning deep kernels for exponential family densities},
  author    = {Wenliang, Li and Sutherland, Danica J. and Strathmann, Heiko and Gretton, Arthur},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {6737--6746},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/wenliang19a/wenliang19a.pdf},
  url       = {https://proceedings.mlr.press/v97/wenliang19a.html},
  abstract  = {The kernel exponential family is a rich class of distributions, which can be fit efficiently and with statistical guarantees by score matching. Being required to choose a priori a simple kernel such as the Gaussian, however, limits its practical applicability. We provide a scheme for learning a kernel parameterized by a deep network, which can find complex location-dependent local features of the data geometry. This gives a very rich class of density models, capable of fitting complex structures on moderate-dimensional problems. Compared to deep density models fit via maximum likelihood, our approach provides a complementary set of strengths and tradeoffs: in empirical studies, the former can yield higher likelihoods, whereas the latter gives better estimates of the gradient of the log density, the score, which describes the distribution’s shape.}
}
Endnote
%0 Conference Paper
%T Learning deep kernels for exponential family densities
%A Li Wenliang
%A Danica J. Sutherland
%A Heiko Strathmann
%A Arthur Gretton
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-wenliang19a
%I PMLR
%P 6737--6746
%U https://proceedings.mlr.press/v97/wenliang19a.html
%V 97
%X The kernel exponential family is a rich class of distributions, which can be fit efficiently and with statistical guarantees by score matching. Being required to choose a priori a simple kernel such as the Gaussian, however, limits its practical applicability. We provide a scheme for learning a kernel parameterized by a deep network, which can find complex location-dependent local features of the data geometry. This gives a very rich class of density models, capable of fitting complex structures on moderate-dimensional problems. Compared to deep density models fit via maximum likelihood, our approach provides a complementary set of strengths and tradeoffs: in empirical studies, the former can yield higher likelihoods, whereas the latter gives better estimates of the gradient of the log density, the score, which describes the distribution’s shape.
APA
Wenliang, L., Sutherland, D.J., Strathmann, H. & Gretton, A. (2019). Learning deep kernels for exponential family densities. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:6737-6746. Available from https://proceedings.mlr.press/v97/wenliang19a.html.