Minimal Achievable Sufficient Statistic Learning

Milan Cvitkovic, Günther Koliander
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1465-1474, 2019.

Abstract

We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective for which the minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that, unlike standard mutual information, can be usefully applied to deterministically dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.
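For intuition, here is a hedged sketch of how a CDI-like quantity can be defined; the specific form below is a reconstruction from the change-of-variables formula for differential entropy, not a definition quoted from this page. If X is a continuous random variable and f is a deterministic smooth map whose Jacobian Df(x) has full row rank, the mutual information I(X; f(X)) is infinite, but a quantity of the form

C\big(X \to f(X)\big) = h\big(f(X)\big) - \mathbb{E}_X\Big[\log \sqrt{\det\big(Df(X)\,Df(X)^{\top}\big)}\Big]

remains finite, where h denotes differential entropy. For invertible f, the change-of-variables identity h(f(X)) = h(X) + E_X[log |det Df(X)|] collapses this quantity to h(X), so it is exactly conserved by invertible maps, which matches the name.

The following is a minimal sketch, assuming PyTorch, of estimating the expected log-Jacobian term by Monte Carlo; the network f, the Gaussian input distribution, and the sample size are illustrative placeholders, not the paper's experimental setup.

import torch
from torch.autograd.functional import jacobian

torch.manual_seed(0)

# A small deterministic map f: R^8 -> R^2, standing in for a deep network.
f = torch.nn.Sequential(
    torch.nn.Linear(8, 16),
    torch.nn.Tanh(),
    torch.nn.Linear(16, 2),
)

def log_jacobian_term(x):
    # J is the 2 x 8 Jacobian of f at x; the generalized Jacobian
    # determinant is sqrt(det(J J^T)), finite when J has full row rank.
    J = jacobian(lambda inp: f(inp), x)
    return 0.5 * torch.logdet(J @ J.T)

# Monte Carlo estimate of E_X[log sqrt(det(J J^T))] under X ~ N(0, I).
xs = torch.randn(64, 8)
estimate = torch.stack([log_jacobian_term(x) for x in xs]).mean()
print(estimate.item())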

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-cvitkovic19a,
  title     = {Minimal Achievable Sufficient Statistic Learning},
  author    = {Cvitkovic, Milan and Koliander, G{\"u}nther},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {1465--1474},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/cvitkovic19a/cvitkovic19a.pdf},
  url       = {https://proceedings.mlr.press/v97/cvitkovic19a.html},
  abstract  = {We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective for which the minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that, unlike standard mutual information, can be usefully applied to deterministically dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.}
}
Endnote
%0 Conference Paper
%T Minimal Achievable Sufficient Statistic Learning
%A Milan Cvitkovic
%A Günther Koliander
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-cvitkovic19a
%I PMLR
%P 1465--1474
%U https://proceedings.mlr.press/v97/cvitkovic19a.html
%V 97
%X We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a machine learning training objective for which the minima are minimal sufficient statistics with respect to a class of functions being optimized over (e.g., deep networks). In deriving MASS Learning, we also introduce Conserved Differential Information (CDI), an information-theoretic quantity that, unlike standard mutual information, can be usefully applied to deterministically dependent continuous random variables like the input and output of a deep network. In a series of experiments, we show that deep networks trained with MASS Learning achieve competitive performance on supervised learning, regularization, and uncertainty quantification benchmarks.
APA
Cvitkovic, M. & Koliander, G. (2019). Minimal Achievable Sufficient Statistic Learning. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:1465-1474. Available from https://proceedings.mlr.press/v97/cvitkovic19a.html.
