Attribute-efficient learning of monomials over highly-correlated variables

Alexandr Andoni, Rishabh Dudeja, Daniel Hsu, Kiran Vodrahalli
Proceedings of the 30th International Conference on Algorithmic Learning Theory, PMLR 98:127-161, 2019.

Abstract

We study the problem of learning a real-valued function of correlated variables. Solving this problem is of interest since many classical learning results apply only in the case of learning functions of random variables that are independent. We show how to recover a high-dimensional, sparse monomial model from Gaussian examples with sample complexity that is poly-logarithmic in the total number of variables and polynomial in the number of relevant variables. Our algorithm is based on a transformation of the variables—taking their logarithm—followed by a sparse linear regression procedure, which is statistically and computationally efficient. While this transformation is commonly used in applied non-linear regression, its statistical guarantees have never been rigorously analyzed. We prove that the sparse regression procedure succeeds even in cases where the original features are highly correlated and fail to satisfy the standard assumptions required for sparse linear regression.
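As an informal illustration of the transformation the abstract describes (and not the paper's exact procedure), the sketch below recovers the support of a sparse monomial y = ∏_{i∈S} x_i from Gaussian examples: taking logarithms turns the monomial into a sparse linear model, log|y| = Σ_{i∈S} log|x_i|, which a sparse regression solver can fit. For simplicity the variables here are independent (the paper's contribution is handling highly correlated designs), and all dimensions, the regularization level, and the small ISTA Lasso solver are illustrative choices, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 500          # few samples, many total variables
support = [3, 17, 42]    # the k = 3 relevant variables (illustrative)

# Gaussian examples and a sparse monomial label (all exponents = 1)
X = rng.standard_normal((n, d))
y = np.prod(X[:, support], axis=1)

# Log-transform: log|y| = sum_{i in S} log|x_i| is sparse and linear
A = np.log(np.abs(X))
b = np.log(np.abs(y))
A = A - A.mean(axis=0)   # centering plays the role of an intercept
b = b - b.mean()

# Minimal ISTA solver for the Lasso:
#   min_w (1/2n)||Aw - b||^2 + alpha * ||w||_1
alpha = 0.1
L = np.linalg.norm(A, 2) ** 2 / n   # Lipschitz constant of the gradient
w = np.zeros(d)
for _ in range(2000):
    grad = A.T @ (A @ w - b) / n
    z = w - grad / L
    w = np.sign(z) * np.maximum(np.abs(z) - alpha / L, 0.0)

# Threshold the fitted coefficients to read off the support
recovered = sorted(np.flatnonzero(np.abs(w) > 0.5))
print(recovered)
```

In the noiseless setting above the Lasso coefficients on the true support are close to 1 (shrunk slightly by the ℓ1 penalty) and the rest are near zero, so a simple threshold reads off the relevant variables.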

Cite this Paper


BibTeX
@InProceedings{pmlr-v98-andoni19a,
  title     = {Attribute-efficient learning of monomials over highly-correlated variables},
  author    = {Andoni, Alexandr and Dudeja, Rishabh and Hsu, Daniel and Vodrahalli, Kiran},
  booktitle = {Proceedings of the 30th International Conference on Algorithmic Learning Theory},
  pages     = {127--161},
  year      = {2019},
  editor    = {Garivier, Aurélien and Kale, Satyen},
  volume    = {98},
  series    = {Proceedings of Machine Learning Research},
  month     = {22--24 Mar},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v98/andoni19a/andoni19a.pdf},
  url       = {https://proceedings.mlr.press/v98/andoni19a.html},
  abstract  = {We study the problem of learning a real-valued function of correlated variables. Solving this problem is of interest since many classical learning results apply only in the case of learning functions of random variables that are independent. We show how to recover a high-dimensional, sparse monomial model from Gaussian examples with sample complexity that is poly-logarithmic in the total number of variables and polynomial in the number of relevant variables. Our algorithm is based on a transformation of the variables—taking their logarithm—followed by a sparse linear regression procedure, which is statistically and computationally efficient. While this transformation is commonly used in applied non-linear regression, its statistical guarantees have never been rigorously analyzed. We prove that the sparse regression procedure succeeds even in cases where the original features are highly correlated and fail to satisfy the standard assumptions required for sparse linear regression.}
}
Endnote
%0 Conference Paper
%T Attribute-efficient learning of monomials over highly-correlated variables
%A Alexandr Andoni
%A Rishabh Dudeja
%A Daniel Hsu
%A Kiran Vodrahalli
%B Proceedings of the 30th International Conference on Algorithmic Learning Theory
%C Proceedings of Machine Learning Research
%D 2019
%E Aurélien Garivier
%E Satyen Kale
%F pmlr-v98-andoni19a
%I PMLR
%P 127--161
%U https://proceedings.mlr.press/v98/andoni19a.html
%V 98
%X We study the problem of learning a real-valued function of correlated variables. Solving this problem is of interest since many classical learning results apply only in the case of learning functions of random variables that are independent. We show how to recover a high-dimensional, sparse monomial model from Gaussian examples with sample complexity that is poly-logarithmic in the total number of variables and polynomial in the number of relevant variables. Our algorithm is based on a transformation of the variables—taking their logarithm—followed by a sparse linear regression procedure, which is statistically and computationally efficient. While this transformation is commonly used in applied non-linear regression, its statistical guarantees have never been rigorously analyzed. We prove that the sparse regression procedure succeeds even in cases where the original features are highly correlated and fail to satisfy the standard assumptions required for sparse linear regression.
APA
Andoni, A., Dudeja, R., Hsu, D. & Vodrahalli, K. (2019). Attribute-efficient learning of monomials over highly-correlated variables. Proceedings of the 30th International Conference on Algorithmic Learning Theory, in Proceedings of Machine Learning Research 98:127-161. Available from https://proceedings.mlr.press/v98/andoni19a.html.