Boosted Density Estimation Remastered

Zac Cranko, Richard Nock
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1416-1425, 2019.

Abstract

There has recently been a steady increase in the number of iterative approaches to density estimation. However, an accompanying burst of formal convergence guarantees has not followed; all results pay the price of heavy assumptions which are often unrealistic or hard to check. The Generative Adversarial Network (GAN) literature — seemingly orthogonal to the aforementioned pursuit — has had the side effect of a renewed interest in variational divergence minimisation (notably $f$-GAN). We show how to combine this latter approach and the classical boosting theory in supervised learning to get the first density estimation algorithm that provably achieves geometric convergence under very weak assumptions. We do so by a trick allowing us to combine classifiers as the sufficient statistics of an exponential family. Our analysis includes an improved variational characterisation of $f$-GAN.

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-cranko19b,
  title     = {Boosted Density Estimation Remastered},
  author    = {Cranko, Zac and Nock, Richard},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {1416--1425},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/cranko19b/cranko19b.pdf},
  url       = {https://proceedings.mlr.press/v97/cranko19b.html},
  abstract  = {There has recently been a steady increase in the number of iterative approaches to density estimation. However, an accompanying burst of formal convergence guarantees has not followed; all results pay the price of heavy assumptions which are often unrealistic or hard to check. The Generative Adversarial Network (GAN) literature — seemingly orthogonal to the aforementioned pursuit — has had the side effect of a renewed interest in variational divergence minimisation (notably $f$-GAN). We show how to combine this latter approach and the classical boosting theory in supervised learning to get the first density estimation algorithm that provably achieves geometric convergence under very weak assumptions. We do so by a trick allowing us to combine classifiers as the sufficient statistics of an exponential family. Our analysis includes an improved variational characterisation of $f$-GAN.}
}
Endnote
%0 Conference Paper
%T Boosted Density Estimation Remastered
%A Zac Cranko
%A Richard Nock
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-cranko19b
%I PMLR
%P 1416--1425
%U https://proceedings.mlr.press/v97/cranko19b.html
%V 97
%X There has recently been a steady increase in the number of iterative approaches to density estimation. However, an accompanying burst of formal convergence guarantees has not followed; all results pay the price of heavy assumptions which are often unrealistic or hard to check. The Generative Adversarial Network (GAN) literature — seemingly orthogonal to the aforementioned pursuit — has had the side effect of a renewed interest in variational divergence minimisation (notably $f$-GAN). We show how to combine this latter approach and the classical boosting theory in supervised learning to get the first density estimation algorithm that provably achieves geometric convergence under very weak assumptions. We do so by a trick allowing us to combine classifiers as the sufficient statistics of an exponential family. Our analysis includes an improved variational characterisation of $f$-GAN.
APA
Cranko, Z. & Nock, R. (2019). Boosted Density Estimation Remastered. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:1416-1425. Available from https://proceedings.mlr.press/v97/cranko19b.html.

Related Material