More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method

Kazuya Sugiyama, Vo Nguyen Le Duy, Ichiro Takeuchi
Proceedings of the 38th International Conference on Machine Learning, PMLR 139:9891-9901, 2021.

Abstract

Conditional selective inference (SI) has been actively studied as a new statistical inference framework for data-driven hypotheses. The basic idea of conditional SI is to make inferences conditional on the selection event characterized by a set of linear and/or quadratic inequalities. Conditional SI has been mainly studied in the context of feature selection such as stepwise feature selection (SFS). The main limitation of the existing conditional SI methods is the loss of power due to over-conditioning, which is required for computational tractability. In this study, we develop a more powerful and general conditional SI method for SFS using the homotopy method which enables us to overcome this limitation. The homotopy-based SI is especially effective for more complicated feature selection algorithms. As an example, we develop a conditional SI method for forward-backward SFS with AIC-based stopping criteria and show that it is not adversely affected by the increased complexity of the algorithm. We conduct several experiments to demonstrate the effectiveness and efficiency of the proposed method.
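To make the setting concrete, the following is a minimal sketch of the forward stepwise feature selection (SFS) procedure that the paper's inference method targets: at each step, the feature that most reduces the residual sum of squares of a least-squares fit is added. This is an illustrative sketch only; the function name `forward_sfs` and the toy data are assumptions, not the authors' implementation, and the selective-inference machinery itself is not shown.

```python
import numpy as np

def forward_sfs(X, y, k):
    """Greedily select k feature indices from X (n x p) for target y.

    At each step, add the candidate feature whose inclusion yields the
    smallest residual sum of squares (RSS) of a least-squares fit.
    """
    n, p = X.shape
    selected = []
    for _ in range(k):
        best_j, best_rss = None, np.inf
        for j in range(p):
            if j in selected:
                continue
            A = X[:, selected + [j]]
            # Least-squares fit on the candidate feature set
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ beta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
    return selected

# Toy example: y depends only on columns 0 and 2
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3 * X[:, 0] - 2 * X[:, 2] + 0.1 * rng.normal(size=100)
print(forward_sfs(X, y, 2))  # expected to recover features 0 and 2
```

Because the selected set depends on the data, naive p-values for the chosen coefficients are invalid; conditional SI corrects for this by conditioning on the event that this greedy procedure selected exactly these features.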

Cite this Paper


BibTeX
@InProceedings{pmlr-v139-sugiyama21a,
  title     = {More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method},
  author    = {Sugiyama, Kazuya and Duy, Vo Nguyen Le and Takeuchi, Ichiro},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {9891--9901},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/sugiyama21a/sugiyama21a.pdf},
  url       = {https://proceedings.mlr.press/v139/sugiyama21a.html},
  abstract  = {Conditional selective inference (SI) has been actively studied as a new statistical inference framework for data-driven hypotheses. The basic idea of conditional SI is to make inferences conditional on the selection event characterized by a set of linear and/or quadratic inequalities. Conditional SI has been mainly studied in the context of feature selection such as stepwise feature selection (SFS). The main limitation of the existing conditional SI methods is the loss of power due to over-conditioning, which is required for computational tractability. In this study, we develop a more powerful and general conditional SI method for SFS using the homotopy method which enables us to overcome this limitation. The homotopy-based SI is especially effective for more complicated feature selection algorithms. As an example, we develop a conditional SI method for forward-backward SFS with AIC-based stopping criteria and show that it is not adversely affected by the increased complexity of the algorithm. We conduct several experiments to demonstrate the effectiveness and efficiency of the proposed method.}
}
Endnote
%0 Conference Paper
%T More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method
%A Kazuya Sugiyama
%A Vo Nguyen Le Duy
%A Ichiro Takeuchi
%B Proceedings of the 38th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2021
%E Marina Meila
%E Tong Zhang
%F pmlr-v139-sugiyama21a
%I PMLR
%P 9891--9901
%U https://proceedings.mlr.press/v139/sugiyama21a.html
%V 139
%X Conditional selective inference (SI) has been actively studied as a new statistical inference framework for data-driven hypotheses. The basic idea of conditional SI is to make inferences conditional on the selection event characterized by a set of linear and/or quadratic inequalities. Conditional SI has been mainly studied in the context of feature selection such as stepwise feature selection (SFS). The main limitation of the existing conditional SI methods is the loss of power due to over-conditioning, which is required for computational tractability. In this study, we develop a more powerful and general conditional SI method for SFS using the homotopy method which enables us to overcome this limitation. The homotopy-based SI is especially effective for more complicated feature selection algorithms. As an example, we develop a conditional SI method for forward-backward SFS with AIC-based stopping criteria and show that it is not adversely affected by the increased complexity of the algorithm. We conduct several experiments to demonstrate the effectiveness and efficiency of the proposed method.
APA
Sugiyama, K., Duy, V.N.L. & Takeuchi, I. (2021). More Powerful and General Selective Inference for Stepwise Feature Selection using Homotopy Method. Proceedings of the 38th International Conference on Machine Learning, in Proceedings of Machine Learning Research 139:9891-9901. Available from https://proceedings.mlr.press/v139/sugiyama21a.html.