Efficient List-Decodable Regression using Batches

Abhimanyu Das, Ayush Jain, Weihao Kong, Rajat Sen
Proceedings of the 40th International Conference on Machine Learning, PMLR 202:7025-7065, 2023.

Abstract

We demonstrate the use of batches in studying list-decodable linear regression, in which only an $\alpha\in (0,1]$ fraction of batches contain genuine samples from a common distribution and the rest can contain arbitrary or even adversarial samples. When genuine batches have $\ge \tilde\Omega(1/\alpha)$ samples each, our algorithm can efficiently find a small list of potential regression parameters such that, with high probability, one of them is close to the true parameter. This is the first polynomial-time algorithm for list-decodable linear regression, and its sample complexity scales nearly linearly with the dimension of the covariates. The polynomial-time algorithm is made possible by the batch structure and may not be feasible without it, as suggested by a recent Statistical Query lower bound (Diakonikolas et al., 2021b).
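For intuition, the generative setting can be sketched in a few lines of Python. In the sketch below, the parameter values, the Gaussian covariate and noise choices, and the per-batch least-squares baseline are all illustrative assumptions, not taken from the paper: it draws batches of which roughly an $\alpha$ fraction follow a common linear model $y = \langle w^*, x\rangle + \text{noise}$ while the rest are arbitrary, and the naive per-batch fit at the end only illustrates the list-decoding goal of producing a list containing at least one estimate close to $w^*$; it is not the paper's efficient algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not values from the paper):
d = 20           # covariate dimension
alpha = 0.2      # fraction of genuine batches
k = 40           # samples per batch; the paper's regime needs k >= Omega~(1/alpha)
n_batches = 200

w_star = rng.normal(size=d) / np.sqrt(d)  # unknown true regression parameter

batches = []
for _ in range(n_batches):
    X = rng.normal(size=(k, d))
    if rng.random() < alpha:
        # genuine batch: samples from the common linear model
        y = X @ w_star + 0.1 * rng.normal(size=k)
    else:
        # contaminated batch: here simply a different random linear model,
        # though the setting allows arbitrary or adversarial samples
        y = X @ rng.normal(size=d) + rng.normal(size=k)
    batches.append((X, y))

# Naive stand-in for list decoding (NOT the paper's algorithm): fit each
# batch separately and output all n_batches estimates as the candidate list.
# List-decodable regression asks for a *small* list (e.g., O(1/alpha) entries)
# that still contains an estimate close to w_star.
candidates = [np.linalg.lstsq(X, y, rcond=None)[0] for X, y in batches]
print("best candidate error:", min(np.linalg.norm(w - w_star) for w in candidates))

Note that for simplicity the sketch uses k > d, so each genuine batch alone already identifies $w^*$ up to noise; the paper addresses the harder regime where the batch size need only be $\tilde\Omega(1/\alpha)$, which can be far smaller than the dimension.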

Cite this Paper


BibTeX
@InProceedings{pmlr-v202-das23b,
  title     = {Efficient List-Decodable Regression using Batches},
  author    = {Das, Abhimanyu and Jain, Ayush and Kong, Weihao and Sen, Rajat},
  booktitle = {Proceedings of the 40th International Conference on Machine Learning},
  pages     = {7025--7065},
  year      = {2023},
  editor    = {Krause, Andreas and Brunskill, Emma and Cho, Kyunghyun and Engelhardt, Barbara and Sabato, Sivan and Scarlett, Jonathan},
  volume    = {202},
  series    = {Proceedings of Machine Learning Research},
  month     = {23--29 Jul},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v202/das23b/das23b.pdf},
  url       = {https://proceedings.mlr.press/v202/das23b.html},
  abstract  = {We demonstrate the use of batches in studying list-decodable linear regression, in which only $\alpha\in (0,1]$ fraction of batches contain genuine samples from a common distribution and the rest can contain arbitrary or even adversarial samples. When genuine batches have $\ge \tilde\Omega(1/\alpha)$ samples each, our algorithm can efficiently find a small list of potential regression parameters, with a high probability that one of them is close to the true parameter. This is the first polynomial time algorithm for list-decodable linear regression, and its sample complexity scales nearly linearly with the dimension of the covariates. The polynomial time algorithm is made possible by the batch structure and may not be feasible without it, as suggested by a recent Statistical Query lower bound (Diakonikolas et al., 2021b).}
}
Endnote
%0 Conference Paper
%T Efficient List-Decodable Regression using Batches
%A Abhimanyu Das
%A Ayush Jain
%A Weihao Kong
%A Rajat Sen
%B Proceedings of the 40th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2023
%E Andreas Krause
%E Emma Brunskill
%E Kyunghyun Cho
%E Barbara Engelhardt
%E Sivan Sabato
%E Jonathan Scarlett
%F pmlr-v202-das23b
%I PMLR
%P 7025--7065
%U https://proceedings.mlr.press/v202/das23b.html
%V 202
%X We demonstrate the use of batches in studying list-decodable linear regression, in which only $\alpha\in (0,1]$ fraction of batches contain genuine samples from a common distribution and the rest can contain arbitrary or even adversarial samples. When genuine batches have $\ge \tilde\Omega(1/\alpha)$ samples each, our algorithm can efficiently find a small list of potential regression parameters, with a high probability that one of them is close to the true parameter. This is the first polynomial time algorithm for list-decodable linear regression, and its sample complexity scales nearly linearly with the dimension of the covariates. The polynomial time algorithm is made possible by the batch structure and may not be feasible without it, as suggested by a recent Statistical Query lower bound (Diakonikolas et al., 2021b).
APA
Das, A., Jain, A., Kong, W. & Sen, R. (2023). Efficient List-Decodable Regression using Batches. Proceedings of the 40th International Conference on Machine Learning, in Proceedings of Machine Learning Research 202:7025-7065. Available from https://proceedings.mlr.press/v202/das23b.html.