High-dimensional Robust Mean Estimation via Gradient Descent

Yu Cheng, Ilias Diakonikolas, Rong Ge, Mahdi Soltanolkotabi
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:1768-1778, 2020.

Abstract

We study the problem of high-dimensional robust mean estimation in the presence of a constant fraction of adversarial outliers. A recent line of work has provided sophisticated polynomial-time algorithms for this problem with dimension-independent error guarantees for a range of natural distribution families. In this work, we show that a natural non-convex formulation of the problem can be solved directly by gradient descent. Our approach leverages a novel structural lemma, roughly showing that any approximate stationary point of our non-convex objective gives a near-optimal solution to the underlying robust estimation task. Our work establishes an intriguing connection between algorithmic high-dimensional robust statistics and non-convex optimization, which may have broader applications to other robust estimation tasks.
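To give a concrete flavor of the approach the abstract describes, here is a minimal, illustrative sketch: projected (sub)gradient descent on sample weights, where the objective is the top eigenvalue of the weighted covariance matrix, so that points inflating the spectral norm get down-weighted. The specific objective, step size, and the clip-and-renormalize approximation to the projection below are simplifications chosen for illustration, not a faithful reproduction of the authors' algorithm or guarantees.

```python
import numpy as np

def robust_mean(X, eps=0.1, iters=30, eta=5e-5):
    """Sketch of robust mean estimation via gradient descent on sample weights.

    Approximately minimizes the top eigenvalue of the weighted covariance
    over weights in (roughly) the set {0 <= w_i <= 1/((1-eps)n), sum w = 1};
    outliers that stretch the covariance along its top eigenvector receive
    large gradients and are driven toward zero weight.
    """
    n, d = X.shape
    cap = 1.0 / ((1.0 - eps) * n)       # per-sample weight cap
    w = np.full(n, 1.0 / n)             # start from uniform weights
    for _ in range(iters):
        mu = w @ X                      # current weighted mean
        centered = X - mu
        cov = centered.T @ (centered * w[:, None])   # weighted covariance
        _, vecs = np.linalg.eigh(cov)
        v = vecs[:, -1]                 # top eigenvector of the covariance
        g = (centered @ v) ** 2         # (sub)gradient of top eigenvalue wrt w
        w = w - eta * g
        # crude approximate projection back onto the constraint set:
        # alternate clipping to [0, cap] with renormalizing to sum 1
        for _ in range(5):
            w = np.clip(w, 0.0, cap)
            w = w / w.sum()
    return w @ X

# toy usage: true mean is zero; 10% of the points are shifted far away
rng = np.random.default_rng(0)
X = rng.standard_normal((400, 20))
X[:40] += 5.0                           # adversarial-style outliers
mu_hat = robust_mean(X, eps=0.1)
```

On this toy instance the weighted estimate should land much closer to the true mean (zero) than the naive sample mean, which the outliers drag toward them.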

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-cheng20a,
  title     = {High-dimensional Robust Mean Estimation via Gradient Descent},
  author    = {Cheng, Yu and Diakonikolas, Ilias and Ge, Rong and Soltanolkotabi, Mahdi},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {1768--1778},
  year      = {2020},
  editor    = {Hal Daumé III and Aarti Singh},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/cheng20a/cheng20a.pdf},
  url       = {http://proceedings.mlr.press/v119/cheng20a.html},
  abstract  = {We study the problem of high-dimensional robust mean estimation in the presence of a constant fraction of adversarial outliers. A recent line of work has provided sophisticated polynomial-time algorithms for this problem with dimension-independent error guarantees for a range of natural distribution families. In this work, we show that a natural non-convex formulation of the problem can be solved directly by gradient descent. Our approach leverages a novel structural lemma, roughly showing that any approximate stationary point of our non-convex objective gives a near-optimal solution to the underlying robust estimation task. Our work establishes an intriguing connection between algorithmic high-dimensional robust statistics and non-convex optimization, which may have broader applications to other robust estimation tasks.}
}
Endnote
%0 Conference Paper
%T High-dimensional Robust Mean Estimation via Gradient Descent
%A Yu Cheng
%A Ilias Diakonikolas
%A Rong Ge
%A Mahdi Soltanolkotabi
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-cheng20a
%I PMLR
%P 1768--1778
%U http://proceedings.mlr.press/v119/cheng20a.html
%V 119
%X We study the problem of high-dimensional robust mean estimation in the presence of a constant fraction of adversarial outliers. A recent line of work has provided sophisticated polynomial-time algorithms for this problem with dimension-independent error guarantees for a range of natural distribution families. In this work, we show that a natural non-convex formulation of the problem can be solved directly by gradient descent. Our approach leverages a novel structural lemma, roughly showing that any approximate stationary point of our non-convex objective gives a near-optimal solution to the underlying robust estimation task. Our work establishes an intriguing connection between algorithmic high-dimensional robust statistics and non-convex optimization, which may have broader applications to other robust estimation tasks.
APA
Cheng, Y., Diakonikolas, I., Ge, R. &amp; Soltanolkotabi, M. (2020). High-dimensional Robust Mean Estimation via Gradient Descent. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:1768-1778. Available from http://proceedings.mlr.press/v119/cheng20a.html.