Nonparametric Sequential Prediction While Deep Learning the Kernel

Guy Uziel
Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, PMLR 108:111-121, 2020.

Abstract

The research on online learning under stationary and ergodic processes has mainly focused on achieving asymptotic guarantees. Although all the methods pursue the same asymptotic goal, their performance varies when handling finite-sample datasets and depends heavily on which predefined density estimation method is chosen. In this paper, therefore, we propose a novel algorithm that simultaneously satisfies a short-term goal, to perform as well as the best data-adaptive kernel chosen in hindsight, learned using a deep neural network, and a long-term goal, to achieve the same theoretical asymptotic guarantee. We present theoretical proofs for our algorithms and demonstrate the validity of our method on the online portfolio selection problem.

Cite this Paper


BibTeX
@InProceedings{pmlr-v108-uziel20b,
  title     = {Nonparametric Sequential Prediction While Deep Learning the Kernel},
  author    = {Uziel, Guy},
  booktitle = {Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics},
  pages     = {111--121},
  year      = {2020},
  editor    = {Chiappa, Silvia and Calandra, Roberto},
  volume    = {108},
  series    = {Proceedings of Machine Learning Research},
  month     = {26--28 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v108/uziel20b/uziel20b.pdf},
  url       = {https://proceedings.mlr.press/v108/uziel20b.html},
  abstract  = {The research on online learning under stationary and ergodic processes has been mainly focused on achieving asymptotic guarantees. Although all the methods pursue the same asymptotic goal, their performance varies when handling finite sample datasets and depends heavily on which predefined density estimation method is chosen. In this paper, therefore, we propose a novel algorithm that simultaneously satisfies a short-term goal, to perform as good as the best choice in hindsight of a data-adaptive kernel, learned using a deep neural network, and a long-term goal, to achieve the same theoretical asymptotic guarantee. We present theoretical proofs for our algorithms and demonstrate the validity of our method on the online portfolio selection problem.}
}
Endnote
%0 Conference Paper
%T Nonparametric Sequential Prediction While Deep Learning the Kernel
%A Guy Uziel
%B Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2020
%E Silvia Chiappa
%E Roberto Calandra
%F pmlr-v108-uziel20b
%I PMLR
%P 111--121
%U https://proceedings.mlr.press/v108/uziel20b.html
%V 108
%X The research on online learning under stationary and ergodic processes has been mainly focused on achieving asymptotic guarantees. Although all the methods pursue the same asymptotic goal, their performance varies when handling finite sample datasets and depends heavily on which predefined density estimation method is chosen. In this paper, therefore, we propose a novel algorithm that simultaneously satisfies a short-term goal, to perform as good as the best choice in hindsight of a data-adaptive kernel, learned using a deep neural network, and a long-term goal, to achieve the same theoretical asymptotic guarantee. We present theoretical proofs for our algorithms and demonstrate the validity of our method on the online portfolio selection problem.
APA
Uziel, G. (2020). Nonparametric Sequential Prediction While Deep Learning the Kernel. Proceedings of the Twenty Third International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research, 108:111-121. Available from https://proceedings.mlr.press/v108/uziel20b.html.

Related Material