Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP

Satyen Kale, Zohar Karnin, Tengyuan Liang, Dávid Pál
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:1780-1788, 2017.

Abstract

Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret with respect to the best sparse linear predictor in hindsight. Without any assumptions, this problem is known to be computationally intractable. In this paper, we make the assumption that the data matrix satisfies the restricted isometry property (RIP), and show that this assumption leads to computationally efficient algorithms with sublinear regret for two variants of the problem. In the first variant, the true label is generated according to a sparse linear model with additive Gaussian noise. In the second, the true label is chosen adversarially.
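
For reference, the restricted isometry property (RIP) invoked in the title is standardly stated as below; the exact normalization and constants used in the paper may differ.

% Standard RIP at sparsity level k with constant \delta \in (0, 1):
% the design matrix X acts as a near-isometry on all k-sparse vectors.
\[
(1 - \delta)\,\lVert w \rVert_2^2 \;\le\; \lVert X w \rVert_2^2 \;\le\; (1 + \delta)\,\lVert w \rVert_2^2
\qquad \text{for all } k\text{-sparse } w \in \mathbb{R}^d.
\]

The interaction protocol described in the abstract can be made concrete with a small simulation. The sketch below is a hypothetical illustration, not the paper's algorithm: a toy learner maintains crude per-coordinate least-squares estimates and chooses which coordinates to probe epsilon-greedily; all names here (k, k_prime, w_star, eps) are ours.

import numpy as np

rng = np.random.default_rng(0)
d, k, k_prime, T = 20, 2, 4, 500  # dimension, sparsity, observation budget, rounds
sigma, eps = 0.1, 0.2             # noise level (stochastic variant), exploration rate

# Hidden k-sparse model for the first variant: y_t = <w_star, x_t> + Gaussian noise.
w_star = np.zeros(d)
w_star[rng.choice(d, size=k, replace=False)] = rng.standard_normal(k)

sum_xy = np.zeros(d)  # per-coordinate running sums for crude weight estimates
sum_xx = np.zeros(d)
total_loss = 0.0

for t in range(T):
    x_t = rng.standard_normal(d)  # feature vector, hidden from the learner
    y_t = w_star @ x_t + sigma * rng.standard_normal()

    # 1. Choose at most k_prime coordinates to observe (explore vs. exploit).
    w_hat = np.divide(sum_xy, sum_xx, out=np.zeros(d), where=sum_xx > 0)
    if rng.random() < eps:
        S = rng.choice(d, size=k_prime, replace=False)
    else:
        S = np.argsort(-np.abs(w_hat))[:k_prime]

    # 2. Observe only x_t[S] and make a real-valued prediction.
    y_pred = w_hat[S] @ x_t[S]

    # 3. Receive the true label and incur the squared loss.
    total_loss += (y_pred - y_t) ** 2

    # 4. Update statistics using only the observed coordinates.
    sum_xy[S] += x_t[S] * y_t
    sum_xx[S] += x_t[S] ** 2

print(f"average squared loss over {T} rounds: {total_loss / T:.4f}")

Under i.i.d. Gaussian features the per-coordinate estimates in this toy are consistent, but the paper's algorithms and regret guarantees rely on the RIP assumption rather than on this simple exploration scheme.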

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-kale17a,
  title = {Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under {RIP}},
  author = {Satyen Kale and Zohar Karnin and Tengyuan Liang and D{\'a}vid P{\'a}l},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages = {1780--1788},
  year = {2017},
  editor = {Precup, Doina and Teh, Yee Whye},
  volume = {70},
  series = {Proceedings of Machine Learning Research},
  month = {06--11 Aug},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v70/kale17a/kale17a.pdf},
  url = {https://proceedings.mlr.press/v70/kale17a.html},
  abstract = {Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret with respect to the best sparse linear predictor in hindsight. Without any assumptions, this problem is known to be computationally intractable. In this paper, we make the assumption that the data matrix satisfies the restricted isometry property (RIP), and show that this assumption leads to computationally efficient algorithms with sublinear regret for two variants of the problem. In the first variant, the true label is generated according to a sparse linear model with additive Gaussian noise. In the second, the true label is chosen adversarially.}
}
Endnote
%0 Conference Paper
%T Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP
%A Satyen Kale
%A Zohar Karnin
%A Tengyuan Liang
%A Dávid Pál
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-kale17a
%I PMLR
%P 1780--1788
%U https://proceedings.mlr.press/v70/kale17a.html
%V 70
%X Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction, receives the true label, and incurs the squared loss. The goal is to design an online learning algorithm with sublinear regret with respect to the best sparse linear predictor in hindsight. Without any assumptions, this problem is known to be computationally intractable. In this paper, we make the assumption that the data matrix satisfies the restricted isometry property (RIP), and show that this assumption leads to computationally efficient algorithms with sublinear regret for two variants of the problem. In the first variant, the true label is generated according to a sparse linear model with additive Gaussian noise. In the second, the true label is chosen adversarially.
APA
Kale, S., Karnin, Z., Liang, T. & Pál, D. (2017). Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:1780-1788. Available from https://proceedings.mlr.press/v70/kale17a.html.

Related Material

Download PDF: http://proceedings.mlr.press/v70/kale17a/kale17a.pdf