PAC-Bayes Analysis Of Maximum Entropy Classification

John Shawe-Taylor, David Hardoon
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:480-487, 2009.

Abstract

We extend and apply the PAC-Bayes theorem to the analysis of maximum entropy learning by considering maximum entropy classification. The theory introduces a multiple sampling technique that controls an effective margin of the bound. We further develop a dual implementation of the convex optimisation that optimises the bound. This algorithm is tested on some simple datasets and the value of the bound compared with the test error.

Cite this Paper

BibTeX
@InProceedings{pmlr-v5-shawe-taylor09a,
  title     = {PAC-Bayes Analysis Of Maximum Entropy Classification},
  author    = {John Shawe-Taylor and David Hardoon},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {480--487},
  year      = {2009},
  editor    = {David van Dyk and Max Welling},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/shawe-taylor09a/shawe-taylor09a.pdf},
  url       = {http://proceedings.mlr.press/v5/shawe-taylor09a.html},
  abstract  = {We extend and apply the PAC-Bayes theorem to the analysis of maximum entropy learning by considering maximum entropy classification. The theory introduces a multiple sampling technique that controls an effective margin of the bound. We further develop a dual implementation of the convex optimisation that optimises the bound. This algorithm is tested on some simple datasets and the value of the bound compared with the test error.}
}
Endnote
%0 Conference Paper
%T PAC-Bayes Analysis Of Maximum Entropy Classification
%A John Shawe-Taylor
%A David Hardoon
%B Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2009
%E David van Dyk
%E Max Welling
%F pmlr-v5-shawe-taylor09a
%I PMLR
%J Proceedings of Machine Learning Research
%P 480--487
%U http://proceedings.mlr.press
%V 5
%W PMLR
%X We extend and apply the PAC-Bayes theorem to the analysis of maximum entropy learning by considering maximum entropy classification. The theory introduces a multiple sampling technique that controls an effective margin of the bound. We further develop a dual implementation of the convex optimisation that optimises the bound. This algorithm is tested on some simple datasets and the value of the bound compared with the test error.
RIS
TY  - CPAPER
TI  - PAC-Bayes Analysis Of Maximum Entropy Classification
AU  - John Shawe-Taylor
AU  - David Hardoon
BT  - Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics
PY  - 2009/04/15
DA  - 2009/04/15
ED  - David van Dyk
ED  - Max Welling
ID  - pmlr-v5-shawe-taylor09a
PB  - PMLR
SP  - 480
EP  - 487
DP  - PMLR
L1  - http://proceedings.mlr.press/v5/shawe-taylor09a/shawe-taylor09a.pdf
UR  - http://proceedings.mlr.press/v5/shawe-taylor09a.html
AB  - We extend and apply the PAC-Bayes theorem to the analysis of maximum entropy learning by considering maximum entropy classification. The theory introduces a multiple sampling technique that controls an effective margin of the bound. We further develop a dual implementation of the convex optimisation that optimises the bound. This algorithm is tested on some simple datasets and the value of the bound compared with the test error.
ER  - 
APA
Shawe-Taylor, J. & Hardoon, D. (2009). PAC-Bayes Analysis Of Maximum Entropy Classification. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in PMLR 5:480-487