High Dimensional Inference in Partially Linear Models

Ying Zhu, Zhuqing Yu, Guang Cheng
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2760-2769, 2019.

Abstract

We propose two semiparametric versions of the debiased Lasso procedure for the model $Y_{i}=X_{i}\beta_{0}+g_{0}(Z_{i})+\varepsilon_{i}$, where the parameter vector of interest $\beta_{0}$ is high dimensional but sparse (exactly or approximately) and $g_{0}$ is an unknown nuisance function. Both versions are shown to have the same asymptotic normal distribution and do not require the minimal signal condition for statistical inference of any component in $\beta_{0}$. We further develop a simultaneous hypothesis testing procedure based on multiplier bootstrap. Our testing method takes into account the dependence structure within the debiased estimates, and allows the number of tested components to be exponentially high.
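To illustrate the multiplier-bootstrap step described above, here is a minimal numpy sketch, assuming one already has the per-observation score (influence-function) contributions of the debiased estimates; the function name, array shapes, and data are illustrative, not the paper's implementation. The dependence across tested components is preserved because every bootstrap draw reuses the same multiplier weights for all columns.

```python
import numpy as np

def multiplier_bootstrap_critical_value(scores, alpha=0.05, B=1000, rng=None):
    """Gaussian-multiplier bootstrap critical value for the max-|t| statistic.

    scores: (n, p) array of per-observation score contributions of the
            debiased estimates (one column per tested component).
    Returns the (1 - alpha) bootstrap quantile of
        max_j | sum_i e_i * scores[i, j] | / (sqrt(n) * sigma_j),
    where e_i are i.i.d. standard normal multiplier weights.
    """
    rng = np.random.default_rng(rng)
    n, p = scores.shape
    sigma = scores.std(axis=0, ddof=1)       # per-component scale estimate
    centered = scores - scores.mean(axis=0)  # recenter before resampling
    stats = np.empty(B)
    for b in range(B):
        e = rng.standard_normal(n)           # shared multipliers -> keeps
        stats[b] = np.max(                   # cross-component dependence
            np.abs(e @ centered) / (np.sqrt(n) * sigma)
        )
    return np.quantile(stats, 1 - alpha)

# Illustrative use with synthetic stand-in scores: under the null, the
# observed max statistic should exceed this critical value with
# probability roughly alpha, even when p is large relative to n.
rng = np.random.default_rng(0)
scores = rng.standard_normal((200, 50))
crit = multiplier_bootstrap_critical_value(scores, alpha=0.05, B=500, rng=1)
```

Because the critical value is the quantile of a maximum rather than a per-coordinate threshold, the test controls the family-wise error rate even when the number of tested components grows much faster than the sample size, which is the regime the abstract refers to.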

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-zhu19c, title = {High Dimensional Inference in Partially Linear Models}, author = {Zhu, Ying and Yu, Zhuqing and Cheng, Guang}, booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics}, pages = {2760--2769}, year = {2019}, editor = {Chaudhuri, Kamalika and Sugiyama, Masashi}, volume = {89}, series = {Proceedings of Machine Learning Research}, month = {16--18 Apr}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v89/zhu19c/zhu19c.pdf}, url = {https://proceedings.mlr.press/v89/zhu19c.html}, abstract = {We propose two semiparametric versions of the debiased Lasso procedure for the model $Y_{i}=X_{i}\beta_{0}+g_{0}(Z_{i})+\varepsilon_{i}$, where the parameter vector of interest $\beta_{0}$ is high dimensional but sparse (exactly or approximately) and $g_{0}$ is an unknown nuisance function. Both versions are shown to have the same asymptotic normal distribution and do not require the minimal signal condition for statistical inference of any component in $\beta_{0}$. We further develop a simultaneous hypothesis testing procedure based on multiplier bootstrap. Our testing method takes into account the dependence structure within the debiased estimates, and allows the number of tested components to be exponentially high.} }
Endnote
%0 Conference Paper %T High Dimensional Inference in Partially Linear Models %A Ying Zhu %A Zhuqing Yu %A Guang Cheng %B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics %C Proceedings of Machine Learning Research %D 2019 %E Kamalika Chaudhuri %E Masashi Sugiyama %F pmlr-v89-zhu19c %I PMLR %P 2760--2769 %U https://proceedings.mlr.press/v89/zhu19c.html %V 89 %X We propose two semiparametric versions of the debiased Lasso procedure for the model $Y_{i}=X_{i}\beta_{0}+g_{0}(Z_{i})+\varepsilon_{i}$, where the parameter vector of interest $\beta_{0}$ is high dimensional but sparse (exactly or approximately) and $g_{0}$ is an unknown nuisance function. Both versions are shown to have the same asymptotic normal distribution and do not require the minimal signal condition for statistical inference of any component in $\beta_{0}$. We further develop a simultaneous hypothesis testing procedure based on multiplier bootstrap. Our testing method takes into account the dependence structure within the debiased estimates, and allows the number of tested components to be exponentially high.
APA
Zhu, Y., Yu, Z. & Cheng, G. (2019). High Dimensional Inference in Partially Linear Models. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2760-2769. Available from https://proceedings.mlr.press/v89/zhu19c.html.