Divergences and Risks for Multiclass Experiments

Dario García-García, Robert C. Williamson
Proceedings of the 25th Annual Conference on Learning Theory, PMLR 23:28.1-28.20, 2012.

Abstract

Csiszár’s $f$-divergence is a way to measure the similarity of two probability distributions. We study the extension of $f$-divergence to more than two distributions to measure their joint similarity. By exploiting classical results from the comparison of experiments literature we prove the resulting divergence satisfies all the same properties as the traditional binary one. Considering the multidistribution case actually makes the proofs simpler. The key to these results is a formal bridge between these multidistribution $f$-divergences and Bayes risks for multiclass classification problems.
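For orientation, here is a minimal sketch of the objects the abstract refers to; the notation below is standard in the $f$-divergence literature, but the exact conventions and normalisation used in the paper may differ. For a convex $f$ with $f(1) = 0$, Csiszár's binary $f$-divergence between distributions $P$ and $Q$ is

$$I_f(P, Q) = \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}Q,$$

and one natural generalisation to $n$ distributions $P_1, \dots, P_n$ takes a jointly convex $f \colon \mathbb{R}_+^{n-1} \to \mathbb{R}$ and sets

$$I_f(P_1, \dots, P_n) = \int f\!\left(\frac{\mathrm{d}P_1}{\mathrm{d}P_n}, \dots, \frac{\mathrm{d}P_{n-1}}{\mathrm{d}P_n}\right) \mathrm{d}P_n.$$

The bridge to multiclass classification mentioned in the abstract relates such a divergence to the statistical information of an $n$-class experiment with class-conditional distributions $P_1, \dots, P_n$: the drop from the prior Bayes risk (deciding from the prior alone) to the full Bayes risk (deciding from the observations), for a loss suitably matched to $f$.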

Cite this Paper

BibTeX
@InProceedings{pmlr-v23-garcia12,
  title     = {Divergences and Risks for Multiclass Experiments},
  author    = {García-García, Dario and Williamson, Robert C.},
  booktitle = {Proceedings of the 25th Annual Conference on Learning Theory},
  pages     = {28.1--28.20},
  year      = {2012},
  editor    = {Mannor, Shie and Srebro, Nathan and Williamson, Robert C.},
  volume    = {23},
  series    = {Proceedings of Machine Learning Research},
  address   = {Edinburgh, Scotland},
  month     = {25--27 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v23/garcia12/garcia12.pdf},
  url       = {https://proceedings.mlr.press/v23/garcia12.html},
  abstract  = {Csiszár’s $f$-divergence is a way to measure the similarity of two probability distributions. We study the extension of $f$-divergence to more than two distributions to measure their joint similarity. By exploiting classical results from the comparison of experiments literature we prove the resulting divergence satisfies all the same properties as the traditional binary one. Considering the multidistribution case actually makes the proofs simpler. The key to these results is a formal bridge between these multidistribution $f$-divergences and Bayes risks for multiclass classification problems.}
}
Endnote
%0 Conference Paper
%T Divergences and Risks for Multiclass Experiments
%A Dario García-García
%A Robert C. Williamson
%B Proceedings of the 25th Annual Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2012
%E Shie Mannor
%E Nathan Srebro
%E Robert C. Williamson
%F pmlr-v23-garcia12
%I PMLR
%P 28.1--28.20
%U https://proceedings.mlr.press/v23/garcia12.html
%V 23
%X Csiszár’s $f$-divergence is a way to measure the similarity of two probability distributions. We study the extension of $f$-divergence to more than two distributions to measure their joint similarity. By exploiting classical results from the comparison of experiments literature we prove the resulting divergence satisfies all the same properties as the traditional binary one. Considering the multidistribution case actually makes the proofs simpler. The key to these results is a formal bridge between these multidistribution $f$-divergences and Bayes risks for multiclass classification problems.
RIS
TY - CPAPER
TI - Divergences and Risks for Multiclass Experiments
AU - Dario García-García
AU - Robert C. Williamson
BT - Proceedings of the 25th Annual Conference on Learning Theory
DA - 2012/06/16
ED - Shie Mannor
ED - Nathan Srebro
ED - Robert C. Williamson
ID - pmlr-v23-garcia12
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 23
SP - 28.1
EP - 28.20
L1 - http://proceedings.mlr.press/v23/garcia12/garcia12.pdf
UR - https://proceedings.mlr.press/v23/garcia12.html
AB - Csiszár’s $f$-divergence is a way to measure the similarity of two probability distributions. We study the extension of $f$-divergence to more than two distributions to measure their joint similarity. By exploiting classical results from the comparison of experiments literature we prove the resulting divergence satisfies all the same properties as the traditional binary one. Considering the multidistribution case actually makes the proofs simpler. The key to these results is a formal bridge between these multidistribution $f$-divergences and Bayes risks for multiclass classification problems.
ER -
APA
García-García, D. & Williamson, R.C. (2012). Divergences and Risks for Multiclass Experiments. Proceedings of the 25th Annual Conference on Learning Theory, in Proceedings of Machine Learning Research 23:28.1-28.20. Available from https://proceedings.mlr.press/v23/garcia12.html.