Sample Compression for Real-Valued Learners
Proceedings of the 30th International Conference on Algorithmic Learning Theory, PMLR 98:466-488, 2019.
Abstract
We give an algorithmically efficient version of the
learner-to-compression scheme conversion in Moran and Yehudayoff
(2016). We further extend this technique to real-valued hypotheses,
to obtain a bounded-size sample compression scheme via an efficient
reduction to a certain generic real-valued learning strategy. To our
knowledge, this is the first general compressed regression result
(regardless of efficiency or boundedness) guaranteeing uniform
approximate reconstruction. Along the way, we develop a generic
procedure for constructing weak real-valued learners out of abstract
regressors; this result is also of independent interest. In
particular, this result sheds new light on an open question of
H. Simon (1997). We show applications to two regression problems:
learning Lipschitz and bounded-variation functions.
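For readers unfamiliar with the central object, the following toy sketch (not from the paper; a standard textbook illustration) shows what a sample compression scheme is in the simplest setting: one-dimensional threshold classifiers h_t(x) = 1 iff x >= t. The `compress`/`reconstruct` names are illustrative; a compression scheme keeps a small subsample from which a consistent hypothesis can be rebuilt.

```python
# Illustrative toy example (not the paper's construction): a size-1
# sample compression scheme for threshold classifiers on the real
# line, h_t(x) = 1 iff x >= t.

def compress(sample):
    """sample: list of (x, y) pairs with y in {0, 1}, assumed
    realizable by some threshold. Returns a compression set of
    at most one labeled point."""
    positives = [x for x, y in sample if y == 1]
    if not positives:
        return []                 # empty set encodes "all negative"
    return [(min(positives), 1)]  # leftmost positively labeled point

def reconstruct(compression_set):
    """Map a compression set back to a hypothesis."""
    if not compression_set:
        return lambda x: 0        # all-negative hypothesis
    t = compression_set[0][0]
    return lambda x: int(x >= t)

# The reconstructed hypothesis agrees with every point of the
# original sample; in this toy binary case reconstruction is exact,
# whereas the paper's real-valued schemes guarantee uniform
# *approximate* reconstruction.
sample = [(-2.0, 0), (-0.5, 0), (1.0, 1), (3.5, 1)]
h = reconstruct(compress(sample))
assert all(h(x) == y for x, y in sample)
```

The compression set here has size 1, independent of the sample size; bounded-size schemes for richer (real-valued) classes are exactly what the paper constructs.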