PAC-Bayesian Bounds on Rate-Efficient Classifiers

Alhabib Abbas, Yiannis Andreopoulos
Proceedings of the 39th International Conference on Machine Learning, PMLR 162:1-9, 2022.

Abstract

We derive analytic bounds on the noise invariance of majority vote classifiers operating on compressed inputs. Specifically, starting from recent bounds on the true risk of majority vote classifiers, we extend the applicability of PAC-Bayesian theory to quantify the resilience of majority votes to input noise stemming from compression. The derived bounds are intuitive in binary classification settings, where they can be measured as expressions of voter differentials and voter pair agreement. By combining measures of input distortion with analytic guarantees on noise invariance, we prescribe rate-efficient machines to compress inputs without affecting subsequent classification. Our validation shows how bounding noise invariance can inform the compression stage for any majority vote classifier such that worst-case implications of bad input reconstructions are known, and inputs can be compressed to the minimum amount of information needed prior to inference.
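As a concrete illustration of the setting the abstract describes (a toy sketch, not the paper's bound), the snippet below builds a simple binary majority vote from per-feature sign "voters" and empirically measures how often the majority vote survives an additive perturbation standing in for compression noise. All names, the voter construction, and the noise model are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary task: the label is the sign of the mean of the input features.
X = rng.normal(size=(200, 8))
y = np.sign(X.mean(axis=1))
y[y == 0] = 1.0

def votes(X):
    """Each 'voter' is a weak stump voting with the sign of one feature."""
    v = np.sign(X)              # shape (n_samples, n_voters)
    v[v == 0] = 1.0
    return v

def majority_vote(X):
    """Binary majority vote over the voters (ties broken toward +1)."""
    return np.sign(votes(X).sum(axis=1) + 1e-12)

# Stand-in for compression distortion: additive noise on the inputs.
X_compressed = X + rng.normal(scale=0.5, size=X.shape)

clean = majority_vote(X)
noisy = majority_vote(X_compressed)

# Empirical "noise invariance": fraction of inputs whose majority vote
# is unchanged by the input perturbation.
invariance = float((clean == noisy).mean())
print(f"vote agreement under noise: {invariance:.3f}")
```

The paper's contribution is an analytic bound on this invariance quantity (via voter differentials and voter pair agreement) rather than the empirical estimate computed here; the sketch only makes the quantity being bounded concrete.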

Cite this Paper

BibTeX
@InProceedings{pmlr-v162-abbas22a,
  title     = {{PAC}-{B}ayesian Bounds on Rate-Efficient Classifiers},
  author    = {Abbas, Alhabib and Andreopoulos, Yiannis},
  booktitle = {Proceedings of the 39th International Conference on Machine Learning},
  pages     = {1--9},
  year      = {2022},
  editor    = {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume    = {162},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--23 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v162/abbas22a/abbas22a.pdf},
  url       = {https://proceedings.mlr.press/v162/abbas22a.html},
  abstract  = {We derive analytic bounds on the noise invariance of majority vote classifiers operating on compressed inputs. Specifically, starting from recent bounds on the true risk of majority vote classifiers, we extend the applicability of PAC-Bayesian theory to quantify the resilience of majority votes to input noise stemming from compression. The derived bounds are intuitive in binary classification settings, where they can be measured as expressions of voter differentials and voter pair agreement. By combining measures of input distortion with analytic guarantees on noise invariance, we prescribe rate-efficient machines to compress inputs without affecting subsequent classification. Our validation shows how bounding noise invariance can inform the compression stage for any majority vote classifier such that worst-case implications of bad input reconstructions are known, and inputs can be compressed to the minimum amount of information needed prior to inference.}
}
Endnote
%0 Conference Paper
%T PAC-Bayesian Bounds on Rate-Efficient Classifiers
%A Alhabib Abbas
%A Yiannis Andreopoulos
%B Proceedings of the 39th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2022
%E Kamalika Chaudhuri
%E Stefanie Jegelka
%E Le Song
%E Csaba Szepesvari
%E Gang Niu
%E Sivan Sabato
%F pmlr-v162-abbas22a
%I PMLR
%P 1--9
%U https://proceedings.mlr.press/v162/abbas22a.html
%V 162
%X We derive analytic bounds on the noise invariance of majority vote classifiers operating on compressed inputs. Specifically, starting from recent bounds on the true risk of majority vote classifiers, we extend the applicability of PAC-Bayesian theory to quantify the resilience of majority votes to input noise stemming from compression. The derived bounds are intuitive in binary classification settings, where they can be measured as expressions of voter differentials and voter pair agreement. By combining measures of input distortion with analytic guarantees on noise invariance, we prescribe rate-efficient machines to compress inputs without affecting subsequent classification. Our validation shows how bounding noise invariance can inform the compression stage for any majority vote classifier such that worst-case implications of bad input reconstructions are known, and inputs can be compressed to the minimum amount of information needed prior to inference.
APA
Abbas, A. & Andreopoulos, Y. (2022). PAC-Bayesian Bounds on Rate-Efficient Classifiers. Proceedings of the 39th International Conference on Machine Learning, in Proceedings of Machine Learning Research 162:1-9. Available from https://proceedings.mlr.press/v162/abbas22a.html.