Open Problem: Efficient Online Sparse Regression

Satyen Kale
Proceedings of The 27th Conference on Learning Theory, PMLR 35:1299-1301, 2014.

Abstract

In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).
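The abstract's formulation can be made concrete as a round-by-round protocol: on each round the learner names at most k of the d coordinates, observes only those feature values, predicts, and then suffers squared loss against the revealed label. The sketch below is illustrative only (the class and function names are hypothetical, not from the paper), and the baseline learner shown — always querying the first k coordinates and running online gradient descent on them — is a naive strategy, not a proposed solution to the open problem.

```python
import random

class FirstKLearner:
    """Hypothetical baseline: always query coordinates 0..k-1 and run
    online gradient descent on just those observed features."""
    def __init__(self, k, eta=0.1):
        self.k, self.eta = k, eta
        self.w = [0.0] * k  # weights over the k queried coordinates

    def choose_coordinates(self, d, k):
        # Budget-respecting choice: at most k coordinates may be observed.
        return list(range(min(k, d)))

    def predict(self, observed):
        # Predict from the observed coordinates alone.
        return sum(self.w[i] * observed[i] for i in observed)

    def update(self, observed, y):
        # Gradient of the squared loss (y_hat - y)^2 w.r.t. each weight.
        grad = 2.0 * (self.predict(observed) - y)
        for i in observed:
            self.w[i] -= self.eta * grad * observed[i]

def online_sparse_regression(data, d, k, learner):
    """One pass of the protocol: per round the learner sees only the
    feature values it queried, predicts, then suffers squared loss."""
    total_loss = 0.0
    for x, y in data:
        S = learner.choose_coordinates(d, k)   # |S| <= k: limited access
        observed = {i: x[i] for i in S}        # hidden: all other coords
        y_hat = learner.predict(observed)
        total_loss += (y_hat - y) ** 2
        learner.update(observed, y)
    return total_loss
```

The open problem asks whether any learner running in time polynomial in the natural parameters can guarantee regret sublinear in the number of rounds against the best k-sparse linear predictor; the fixed-coordinate baseline above carries no such guarantee.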

Cite this Paper


BibTeX
@InProceedings{pmlr-v35-kale14b,
  title     = {Open Problem: Efficient Online Sparse Regression},
  author    = {Kale, Satyen},
  booktitle = {Proceedings of The 27th Conference on Learning Theory},
  pages     = {1299--1301},
  year      = {2014},
  editor    = {Balcan, Maria Florina and Feldman, Vitaly and Szepesvári, Csaba},
  volume    = {35},
  series    = {Proceedings of Machine Learning Research},
  address   = {Barcelona, Spain},
  month     = {13--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v35/kale14b.pdf},
  url       = {https://proceedings.mlr.press/v35/kale14b.html},
  abstract  = {In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).}
}
Endnote
%0 Conference Paper
%T Open Problem: Efficient Online Sparse Regression
%A Satyen Kale
%B Proceedings of The 27th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2014
%E Maria Florina Balcan
%E Vitaly Feldman
%E Csaba Szepesvári
%F pmlr-v35-kale14b
%I PMLR
%P 1299--1301
%U https://proceedings.mlr.press/v35/kale14b.html
%V 35
%X In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).
RIS
TY  - CPAPER
TI  - Open Problem: Efficient Online Sparse Regression
AU  - Satyen Kale
BT  - Proceedings of The 27th Conference on Learning Theory
DA  - 2014/05/29
ED  - Maria Florina Balcan
ED  - Vitaly Feldman
ED  - Csaba Szepesvári
ID  - pmlr-v35-kale14b
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 35
SP  - 1299
EP  - 1301
L1  - http://proceedings.mlr.press/v35/kale14b.pdf
UR  - https://proceedings.mlr.press/v35/kale14b.html
AB  - In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).
ER  -
APA
Kale, S. (2014). Open Problem: Efficient Online Sparse Regression. Proceedings of The 27th Conference on Learning Theory, in Proceedings of Machine Learning Research 35:1299-1301. Available from https://proceedings.mlr.press/v35/kale14b.html.