Open Problem: Efficient Online Sparse Regression

Satyen Kale
Proceedings of The 27th Conference on Learning Theory, PMLR 35:1299-1301, 2014.

Abstract

In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).
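The abstract states the problem only informally. As a rough illustration (not the paper's own formulation), the protocol can be read as: at each round the learner observes at most k feature values of the current example, makes a prediction, suffers squared loss, and is compared against the best k-sparse linear regressor in hindsight. The sketch below simulates this setting under those assumptions, using a naive baseline that commits to a single coordinate subset and runs online gradient descent on it; all names (`w_star`, `S`, the loop structure) are illustrative, and this baseline is not an answer to the open problem.

```python
import numpy as np

rng = np.random.default_rng(0)

T, d, k = 500, 10, 2             # rounds, ambient dimension, sparsity budget

# Hidden k-sparse target: labels depend on only the first k coordinates.
w_star = np.zeros(d)
w_star[:k] = [1.0, -0.5]

X = rng.uniform(-1, 1, size=(T, d))
y = X @ w_star + 0.1 * rng.standard_normal(T)

# Naive baseline: fix one k-subset of coordinates and run online gradient
# descent on it.  Each round the learner sees just those k feature values,
# predicts, then observes the label and takes a gradient step.
S = np.arange(k)                 # the subset the learner commits to
w = np.zeros(k)
eta = 0.1
learner_loss = 0.0
for t in range(T):
    x_obs = X[t, S]                           # limited feature access
    y_hat = w @ x_obs                         # prediction
    learner_loss += (y_hat - y[t]) ** 2       # squared loss
    w -= eta * 2 * (y_hat - y[t]) * x_obs     # OGD step

# Comparator: best fixed linear predictor on the same k coordinates
# in hindsight (least squares over all T rounds).
w_cmp, *_ = np.linalg.lstsq(X[:, S], y, rcond=None)
comparator_loss = np.sum((X[:, S] @ w_cmp - y) ** 2)

regret = learner_loss - comparator_loss
print(f"learner {learner_loss:.1f}, comparator {comparator_loss:.1f}, "
      f"regret {regret:.1f}")
```

In this toy run the learner happens to commit to the informative subset, so regret stays small; the difficulty the open problem highlights is that an efficient algorithm must achieve sublinear regret without knowing which subset matters, which naively requires searching over exponentially many subsets.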

Cite this Paper


BibTeX
@InProceedings{pmlr-v35-kale14b,
  title     = {Open Problem: Efficient Online Sparse Regression},
  author    = {Satyen Kale},
  booktitle = {Proceedings of The 27th Conference on Learning Theory},
  pages     = {1299--1301},
  year      = {2014},
  editor    = {Maria Florina Balcan and Vitaly Feldman and Csaba Szepesvári},
  volume    = {35},
  series    = {Proceedings of Machine Learning Research},
  address   = {Barcelona, Spain},
  month     = {13--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v35/kale14b.pdf},
  url       = {http://proceedings.mlr.press/v35/kale14b.html},
  abstract  = {In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).}
}
Endnote
%0 Conference Paper
%T Open Problem: Efficient Online Sparse Regression
%A Satyen Kale
%B Proceedings of The 27th Conference on Learning Theory
%C Proceedings of Machine Learning Research
%D 2014
%E Maria Florina Balcan
%E Vitaly Feldman
%E Csaba Szepesvári
%F pmlr-v35-kale14b
%I PMLR
%J Proceedings of Machine Learning Research
%P 1299--1301
%U http://proceedings.mlr.press
%V 35
%W PMLR
%X In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).
RIS
TY  - CPAPER
TI  - Open Problem: Efficient Online Sparse Regression
AU  - Satyen Kale
BT  - Proceedings of The 27th Conference on Learning Theory
PY  - 2014/05/29
DA  - 2014/05/29
ED  - Maria Florina Balcan
ED  - Vitaly Feldman
ED  - Csaba Szepesvári
ID  - pmlr-v35-kale14b
PB  - PMLR
SP  - 1299
DP  - PMLR
EP  - 1301
L1  - http://proceedings.mlr.press/v35/kale14b.pdf
UR  - http://proceedings.mlr.press/v35/kale14b.html
AB  - In practical scenarios, it is often necessary to be able to make predictions with very limited access to the features of any example. We provide one natural formulation as an online sparse regression problem with squared loss, and ask whether it is possible to achieve sublinear regret with efficient algorithms (i.e. polynomial running time in the natural parameters of the problem).
ER  -
APA
Kale, S. (2014). Open Problem: Efficient Online Sparse Regression. Proceedings of The 27th Conference on Learning Theory, in PMLR 35:1299-1301.
