Memoryless Sequences for Differentiable Losses

Rafael Frongillo, Andrew Nobel
Proceedings of the 2017 Conference on Learning Theory, PMLR 65:925-939, 2017.

Abstract

One way to define the “randomness” of a fixed individual sequence is to ask how hard it is to predict. When prediction error is measured via squared loss, it has been established that memoryless sequences (which are, in a precise sense, hard to predict) have some of the stochastic attributes of truly random sequences. In this paper, we ask how changing the loss function used changes the set of memoryless sequences, and in particular, the stochastic attributes they possess. We answer this question for differentiable convex loss functions using tools from property elicitation, showing that the property elicited by the loss determines the stochastic attributes of the corresponding memoryless sequences. We apply our results to price calibration in prediction markets.

Cite this Paper


BibTeX
@InProceedings{pmlr-v65-frongillo17a,
  title     = {Memoryless Sequences for Differentiable Losses},
  author    = {Frongillo, Rafael and Nobel, Andrew},
  booktitle = {Proceedings of the 2017 Conference on Learning Theory},
  pages     = {925--939},
  year      = {2017},
  editor    = {Kale, Satyen and Shamir, Ohad},
  volume    = {65},
  series    = {Proceedings of Machine Learning Research},
  month     = {07--10 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v65/frongillo17a/frongillo17a.pdf},
  url       = {https://proceedings.mlr.press/v65/frongillo17a.html},
  abstract  = {One way to define the “randomness” of a fixed individual sequence is to ask how hard it is to predict. When prediction error is measured via squared loss, it has been established that memoryless sequences (which are, in a precise sense, hard to predict) have some of the stochastic attributes of truly random sequences. In this paper, we ask how changing the loss function used changes the set of memoryless sequences, and in particular, the stochastic attributes they possess. We answer this question for differentiable convex loss functions using tools from property elicitation, showing that the property elicited by the loss determines the stochastic attributes of the corresponding memoryless sequences. We apply our results to price calibration in prediction markets.}
}
Endnote
%0 Conference Paper
%T Memoryless Sequences for Differentiable Losses
%A Rafael Frongillo
%A Andrew Nobel
%B Proceedings of the 2017 Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2017
%E Satyen Kale
%E Ohad Shamir
%F pmlr-v65-frongillo17a
%I PMLR
%P 925--939
%U https://proceedings.mlr.press/v65/frongillo17a.html
%V 65
%X One way to define the “randomness” of a fixed individual sequence is to ask how hard it is to predict. When prediction error is measured via squared loss, it has been established that memoryless sequences (which are, in a precise sense, hard to predict) have some of the stochastic attributes of truly random sequences. In this paper, we ask how changing the loss function used changes the set of memoryless sequences, and in particular, the stochastic attributes they possess. We answer this question for differentiable convex loss functions using tools from property elicitation, showing that the property elicited by the loss determines the stochastic attributes of the corresponding memoryless sequences. We apply our results to price calibration in prediction markets.
APA
Frongillo, R. & Nobel, A. (2017). Memoryless Sequences for Differentiable Losses. Proceedings of the 2017 Conference on Learning Theory, in Proceedings of Machine Learning Research 65:925-939. Available from https://proceedings.mlr.press/v65/frongillo17a.html.