Correcting the bias in least squares regression with volume-rescaled sampling

Michal Derezinski, Manfred K. Warmuth, Daniel Hsu
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:944-953, 2019.

Abstract

Consider linear regression where the examples are generated by an unknown distribution on R^d x R. Without any assumptions on the noise, the linear least squares solution for any i.i.d. sample will typically be biased w.r.t. the least squares optimum over the entire distribution. However, we show that if an i.i.d. sample of any size k is augmented by a certain small additional sample, then the solution of the combined sample becomes unbiased. We show this when the additional sample consists of d points drawn jointly according to the input distribution rescaled by the squared volume spanned by the points. Furthermore, we propose algorithms to sample from this volume-rescaled distribution when the data distribution is only known through an i.i.d. sample.
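To make the sampling scheme concrete, below is a minimal sketch in Python. It is not the paper's algorithm: it assumes a small finite pool standing in for the data distribution, performs volume-rescaled sampling by brute-force enumeration of all size-d subsets (probability proportional to the squared determinant, i.e., the squared volume spanned by the d points), and then solves ordinary least squares on the i.i.d. sample augmented with those d points. All variable names and the finite-pool setting are illustrative assumptions.

    # Illustrative sketch only: brute-force volume-rescaled sampling on a finite pool,
    # then ordinary least squares on the combined (i.i.d. + volume-sampled) sample.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    d, n_pool, k = 3, 40, 10                   # dimension, pool size, i.i.d. sample size
    X_pool = rng.standard_normal((n_pool, d))  # finite pool standing in for the input distribution
    y_pool = X_pool @ np.array([1.0, -2.0, 0.5]) + rng.standard_normal(n_pool)  # noisy responses

    def volume_sample(X, d, rng):
        """Draw d row indices jointly with probability proportional to det(X_S)^2."""
        subsets = list(itertools.combinations(range(X.shape[0]), d))
        w = np.array([np.linalg.det(X[list(S)]) ** 2 for S in subsets])
        return list(subsets[rng.choice(len(subsets), p=w / w.sum())])

    iid_idx = list(rng.choice(n_pool, size=k, replace=True))  # i.i.d. sample of size k
    vol_idx = volume_sample(X_pool, d, rng)                   # d jointly volume-sampled points
    idx = iid_idx + vol_idx

    # Least squares solution on the combined sample of size k + d.
    w_hat, *_ = np.linalg.lstsq(X_pool[idx], y_pool[idx], rcond=None)
    print(w_hat)

In this finite stand-in, averaging w_hat over many independent repetitions should approach the pool's exact least squares solution, which is the analogue of the unbiasedness claim in the abstract; the brute-force enumeration is only feasible for small pools, whereas the paper proposes algorithms that sample from the volume-rescaled distribution given only i.i.d. access to the data.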

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-derezinski19a,
  title     = {Correcting the bias in least squares regression with volume-rescaled sampling},
  author    = {Derezinski, Michal and Warmuth, Manfred K. and Hsu, Daniel},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {944--953},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/derezinski19a/derezinski19a.pdf},
  url       = {https://proceedings.mlr.press/v89/derezinski19a.html}
}
APA
Derezinski, M., Warmuth, M.K. & Hsu, D. (2019). Correcting the bias in least squares regression with volume-rescaled sampling. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:944-953. Available from https://proceedings.mlr.press/v89/derezinski19a.html.
