PAC-Bayes Analysis Of Maximum Entropy Classification

John Shawe-Taylor, David Hardoon
Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, PMLR 5:480-487, 2009.

Abstract

We extend and apply the PAC-Bayes theorem to the analysis of maximum entropy learning by considering maximum entropy classification. The theory introduces a multiple sampling technique that controls an effective margin of the bound. We further develop a dual implementation of the convex optimisation that optimises the bound. This algorithm is tested on some simple datasets, and the value of the bound is compared with the test error.
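
For context, a standard (Langford-Seeger) form of the PAC-Bayes theorem that analyses of this kind build on (the notation below is generic and is not taken from the paper itself) states that, with probability at least 1 - \delta over an i.i.d. sample S of size m, simultaneously for all posterior distributions Q over classifiers,

    \mathrm{kl}\!\left( \hat{e}_S(Q) \,\|\, e_D(Q) \right) \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m},

where P is a prior fixed before seeing S, \hat{e}_S(Q) and e_D(Q) are the expected empirical and true errors of the Gibbs classifier drawn from Q, and \mathrm{kl}(q \| p) = q \ln\frac{q}{p} + (1-q)\ln\frac{1-q}{1-p} is the binary KL divergence.

The following minimal Python sketch, written for illustration only and making no claim about the paper's own dual optimisation or multiple-sampling technique, shows the routine step of turning such an inequality into a number: given an empirical error, a KL term, m and delta, the upper bound on the true error is obtained by inverting the binary KL divergence with bisection. The function and variable names here are hypothetical.

    import math

    def binary_kl(q, p):
        # Binary KL divergence kl(q || p), with arguments clamped away from 0 and 1.
        eps = 1e-12
        q = min(max(q, eps), 1 - eps)
        p = min(max(p, eps), 1 - eps)
        return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

    def pac_bayes_error_bound(e_hat, kl_qp, m, delta):
        # Largest p in [e_hat, 1] with kl(e_hat || p) <= (KL(Q||P) + ln((m+1)/delta)) / m,
        # found by bisection; this is the resulting upper bound on the true (Gibbs) error.
        rhs = (kl_qp + math.log((m + 1) / delta)) / m
        lo, hi = e_hat, 1.0
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if binary_kl(e_hat, mid) > rhs:
                hi = mid
            else:
                lo = mid
        return hi

    # Purely hypothetical numbers, for illustration only.
    print(pac_bayes_error_bound(e_hat=0.05, kl_qp=20.0, m=1000, delta=0.05))

A bound computed in this generic way is the kind of quantity that would be compared against a held-out test error, as the abstract describes.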

Cite this Paper


BibTeX
@InProceedings{pmlr-v5-shawe-taylor09a,
  title     = {PAC-Bayes Analysis Of Maximum Entropy Classification},
  author    = {Shawe-Taylor, John and Hardoon, David},
  booktitle = {Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics},
  pages     = {480--487},
  year      = {2009},
  editor    = {van Dyk, David and Welling, Max},
  volume    = {5},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hilton Clearwater Beach Resort, Clearwater Beach, Florida USA},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v5/shawe-taylor09a/shawe-taylor09a.pdf},
  url       = {https://proceedings.mlr.press/v5/shawe-taylor09a.html},
  abstract  = {We extend and apply the PAC-Bayes theorem to the analysis of maximum entropy learning by considering maximum entropy classification. The theory introduces a multiple sampling technique that controls an effective margin of the bound. We further develop a dual implementation of the convex optimisation that optimises the bound. This algorithm is tested on some simple datasets and the value of the bound compared with the test error.}
}
APA
Shawe-Taylor, J. & Hardoon, D. (2009). PAC-Bayes Analysis Of Maximum Entropy Classification. Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 5:480-487. Available from https://proceedings.mlr.press/v5/shawe-taylor09a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v5/shawe-taylor09a/shawe-taylor09a.pdf