Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems

Bhaskar Mukhoty, Govind Gopakumar, Prateek Jain, Purushottam Kar
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:313-322, 2019.

Abstract

We provide the first global model recovery results for the IRLS (iteratively reweighted least squares) heuristic for robust regression problems. IRLS is known to offer excellent performance, despite bad initializations and data corruption, for several parameter estimation problems. Existing analyses of IRLS frequently require careful initialization, thus offering only local convergence guarantees. We remedy this by proposing augmentations to the basic IRLS routine that not only offer guaranteed global recovery, but in practice also outperform state-of-the-art algorithms for robust regression. Our routines are more immune to hyperparameter misspecification in basic regression tasks, as well as applied tasks such as linear-armed bandit problems. Our theoretical analyses rely on a novel extension of the notions of strong convexity and smoothness to weighted strong convexity and smoothness, and establishing that sub-Gaussian designs offer bounded weighted condition numbers. These notions may be useful in analyzing other algorithms as well.
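The basic IRLS heuristic discussed in the abstract alternates a weighted least-squares solve with a residual-based reweighting step, so that points with large residuals (potential corruptions) are progressively downweighted. Below is a minimal sketch using Huber-style weights; this illustrates the plain heuristic only, not the paper's augmented globally-convergent routine, and the function name and parameters are illustrative:

```python
import numpy as np

def irls_robust(X, y, delta=1.0, iters=50, eps=1e-8):
    """Basic IRLS heuristic for robust linear regression (illustrative
    sketch, not the paper's augmented routine). Uses Huber-style weights:
    weight 1 for residuals below delta, delta/|r| above it."""
    n, d = X.shape
    w = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary least-squares init
    for _ in range(iters):
        r = y - X @ w                         # current residuals
        s = np.abs(r)
        weights = np.where(s <= delta, 1.0, delta / np.maximum(s, eps))
        WX = weights[:, None] * X             # diag(weights) @ X
        # Weighted least-squares solve: (X^T W X) w = X^T W y
        w = np.linalg.solve(X.T @ WX + eps * np.eye(d), WX.T @ y)
    return w

# Demo: noiseless inliers plus 10% grossly corrupted responses.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true
y[:20] += 10.0                                # corrupt 20 of 200 responses
w_hat = irls_robust(X, y)
```

In this toy setting the reweighting step drives the corrupted points' weights toward delta/10, so the recovered model lands much closer to the true one than the ordinary least-squares fit does.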

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-mukhoty19a,
  title     = {Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems},
  author    = {Mukhoty, Bhaskar and Gopakumar, Govind and Jain, Prateek and Kar, Purushottam},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {313--322},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/mukhoty19a/mukhoty19a.pdf},
  url       = {https://proceedings.mlr.press/v89/mukhoty19a.html},
  abstract  = {We provide the first global model recovery results for the IRLS (iteratively reweighted least squares) heuristic for robust regression problems. IRLS is known to offer excellent performance, despite bad initializations and data corruption, for several parameter estimation problems. Existing analyses of IRLS frequently require careful initialization, thus offering only local convergence guarantees. We remedy this by proposing augmentations to the basic IRLS routine that not only offer guaranteed global recovery, but in practice also outperform state-of-the-art algorithms for robust regression. Our routines are more immune to hyperparameter misspecification in basic regression tasks, as well as applied tasks such as linear-armed bandit problems. Our theoretical analyses rely on a novel extension of the notions of strong convexity and smoothness to weighted strong convexity and smoothness, and establishing that sub-Gaussian designs offer bounded weighted condition numbers. These notions may be useful in analyzing other algorithms as well.}
}
Endnote
%0 Conference Paper
%T Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems
%A Bhaskar Mukhoty
%A Govind Gopakumar
%A Prateek Jain
%A Purushottam Kar
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-mukhoty19a
%I PMLR
%P 313--322
%U https://proceedings.mlr.press/v89/mukhoty19a.html
%V 89
%X We provide the first global model recovery results for the IRLS (iteratively reweighted least squares) heuristic for robust regression problems. IRLS is known to offer excellent performance, despite bad initializations and data corruption, for several parameter estimation problems. Existing analyses of IRLS frequently require careful initialization, thus offering only local convergence guarantees. We remedy this by proposing augmentations to the basic IRLS routine that not only offer guaranteed global recovery, but in practice also outperform state-of-the-art algorithms for robust regression. Our routines are more immune to hyperparameter misspecification in basic regression tasks, as well as applied tasks such as linear-armed bandit problems. Our theoretical analyses rely on a novel extension of the notions of strong convexity and smoothness to weighted strong convexity and smoothness, and establishing that sub-Gaussian designs offer bounded weighted condition numbers. These notions may be useful in analyzing other algorithms as well.
APA
Mukhoty, B., Gopakumar, G., Jain, P. & Kar, P. (2019). Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:313-322. Available from https://proceedings.mlr.press/v89/mukhoty19a.html.