Efficient learning of smooth probability functions from Bernoulli tests with guarantees

Paul Rolland, Ali Kavis, Alexander Immer, Adish Singla, Volkan Cevher
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:5459-5467, 2019.

Abstract

We study the fundamental problem of learning an unknown, smooth probability function via point-wise Bernoulli tests. We provide a scalable algorithm for efficiently solving this problem with rigorous guarantees. In particular, we prove the convergence rate of our posterior update rule to the true probability function in L2-norm. Moreover, we allow the Bernoulli tests to depend on contextual features, and provide a modified inference engine with provable guarantees for this novel setting. Numerical results show that the empirical convergence rates match the theory, and illustrate the superiority of our approach in handling contextual features over the state-of-the-art.
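To make the problem setup concrete: at each query point we observe only a single Bernoulli outcome whose success probability is the unknown smooth function evaluated there. The paper's posterior update rule is not reproduced here; the sketch below is a minimal, hypothetical baseline (a Nadaraya-Watson kernel smoother over the binary outcomes) that illustrates the data model and the L2-norm error the abstract refers to. All function names and parameter values are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choice of a smooth probability function on [0, 1];
# the true function in the paper's setting is unknown to the learner.
def p_true(x):
    return 0.5 + 0.4 * np.sin(2 * np.pi * x)

# Point-wise Bernoulli tests: at each query point x_i the learner
# observes a single coin flip with success probability p_true(x_i).
n = 5000
xs = rng.uniform(0.0, 1.0, size=n)
ys = rng.binomial(1, p_true(xs))

# Simple kernel-smoothed (Nadaraya-Watson) estimate of p at a point;
# the Gaussian bandwidth of 0.05 is an arbitrary illustrative value.
def estimate(x0, bandwidth=0.05):
    w = np.exp(-0.5 * ((xs - x0) / bandwidth) ** 2)
    return np.sum(w * ys) / np.sum(w)

# Empirical L2 error of the estimate over a grid; smoothness of
# p_true is what makes information from nearby tests transferable.
grid = np.linspace(0.05, 0.95, 19)
p_hat = np.array([estimate(x0) for x0 in grid])
l2_err = np.sqrt(np.mean((p_hat - p_true(grid)) ** 2))
print(f"empirical L2 error with n={n}: {l2_err:.3f}")
```

Rerunning with larger `n` shrinks the empirical L2 error, which is the quantity whose convergence rate the paper bounds for its own (different) posterior update rule.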

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-rolland19a,
  title     = {Efficient learning of smooth probability functions from Bernoulli tests with guarantees},
  author    = {Rolland, Paul and Kavis, Ali and Immer, Alexander and Singla, Adish and Cevher, Volkan},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {5459--5467},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/rolland19a/rolland19a.pdf},
  url       = {https://proceedings.mlr.press/v97/rolland19a.html},
  abstract  = {We study the fundamental problem of learning an unknown, smooth probability function via point-wise Bernoulli tests. We provide a scalable algorithm for efficiently solving this problem with rigorous guarantees. In particular, we prove the convergence rate of our posterior update rule to the true probability function in L2-norm. Moreover, we allow the Bernoulli tests to depend on contextual features, and provide a modified inference engine with provable guarantees for this novel setting. Numerical results show that the empirical convergence rates match the theory, and illustrate the superiority of our approach in handling contextual features over the state-of-the-art.}
}
Endnote
%0 Conference Paper
%T Efficient learning of smooth probability functions from Bernoulli tests with guarantees
%A Paul Rolland
%A Ali Kavis
%A Alexander Immer
%A Adish Singla
%A Volkan Cevher
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-rolland19a
%I PMLR
%P 5459--5467
%U https://proceedings.mlr.press/v97/rolland19a.html
%V 97
%X We study the fundamental problem of learning an unknown, smooth probability function via point-wise Bernoulli tests. We provide a scalable algorithm for efficiently solving this problem with rigorous guarantees. In particular, we prove the convergence rate of our posterior update rule to the true probability function in L2-norm. Moreover, we allow the Bernoulli tests to depend on contextual features, and provide a modified inference engine with provable guarantees for this novel setting. Numerical results show that the empirical convergence rates match the theory, and illustrate the superiority of our approach in handling contextual features over the state-of-the-art.
APA
Rolland, P., Kavis, A., Immer, A., Singla, A. &amp; Cevher, V. (2019). Efficient learning of smooth probability functions from Bernoulli tests with guarantees. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:5459-5467. Available from https://proceedings.mlr.press/v97/rolland19a.html.