Improving the Stability of the Knockoff Procedure: Multiple Simultaneous Knockoffs and Entropy Maximization

Jaime Roquero Gimenez, James Zou
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:2184-2192, 2019.

Abstract

The Model-X knockoff procedure has recently emerged as a powerful approach for feature selection with statistical guarantees. The advantage of knockoffs is that if we have a good model of the features X, then we can identify salient features without knowing anything about how the outcome Y depends on X. An important drawback of knockoffs is its instability: running the procedure twice can result in very different selected features, potentially leading to different conclusions. Addressing this instability is critical for obtaining reproducible and robust results. Here we present a generalization of the knockoff procedure that we call simultaneous multi-knockoffs. We show that multi-knockoffs guarantee false discovery rate (FDR) control, and are substantially more stable and powerful compared to the standard (single) knockoffs. Moreover we propose a new algorithm based on entropy maximization for generating Gaussian multi-knockoffs. We validate the improved stability and power of multi-knockoffs in systematic experiments. We also illustrate how multi-knockoffs can improve the accuracy of detecting genetic mutations that are causally linked to phenotypes.
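The core selection step the abstract refers to can be sketched as follows. In the (single) knockoff filter, each feature j gets a statistic W_j that is large and positive when the feature looks more important than its knockoff copy; features are selected above a data-dependent threshold that controls the FDR at level q. This is an illustrative sketch of the standard knockoff+ threshold, not the authors' released multi-knockoff code; the toy W values below are fabricated for demonstration.

```python
import numpy as np

def knockoff_threshold(W, q=0.1):
    """Knockoff+ threshold: smallest t with estimated FDP <= q.

    FDP is estimated by (1 + #{W_j <= -t}) / max(1, #{W_j >= t}),
    using the sign-symmetry of null statistics."""
    ts = np.sort(np.abs(W[W != 0]))  # candidate thresholds
    for t in ts:
        fdp_hat = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:
            return t
    return np.inf  # no threshold achieves level q; select nothing

# Toy statistics: 10 strong signals (W = 5), 90 nulls with
# sign-symmetric small values (+/- 0.3).
W = np.concatenate([np.full(10, 5.0), np.tile([0.3, -0.3], 45)])
tau = knockoff_threshold(W, q=0.1)
selected = np.where(W >= tau)[0]  # indices 0..9, the true signals
```

The instability the paper targets enters through the randomness in generating the knockoff copies (and hence W): rerunning that generation can flip which features clear the threshold, which is what simultaneous multi-knockoffs are designed to stabilize.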

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-gimenez19b,
  title     = {Improving the Stability of the Knockoff Procedure: Multiple Simultaneous Knockoffs and Entropy Maximization},
  author    = {Gimenez, Jaime Roquero and Zou, James},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {2184--2192},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/gimenez19b/gimenez19b.pdf},
  url       = {https://proceedings.mlr.press/v89/gimenez19b.html},
  abstract  = {The Model-X knockoff procedure has recently emerged as a powerful approach for feature selection with statistical guarantees. The advantage of knockoffs is that if we have a good model of the features X, then we can identify salient features without knowing anything about how the outcome Y depends on X. An important drawback of knockoffs is its instability: running the procedure twice can result in very different selected features, potentially leading to different conclusions. Addressing this instability is critical for obtaining reproducible and robust results. Here we present a generalization of the knockoff procedure that we call simultaneous multi-knockoffs. We show that multi-knockoffs guarantee false discovery rate (FDR) control, and are substantially more stable and powerful compared to the standard (single) knockoffs. Moreover we propose a new algorithm based on entropy maximization for generating Gaussian multi-knockoffs. We validate the improved stability and power of multi-knockoffs in systematic experiments. We also illustrate how multi-knockoffs can improve the accuracy of detecting genetic mutations that are causally linked to phenotypes.}
}
Endnote
%0 Conference Paper
%T Improving the Stability of the Knockoff Procedure: Multiple Simultaneous Knockoffs and Entropy Maximization
%A Jaime Roquero Gimenez
%A James Zou
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-gimenez19b
%I PMLR
%P 2184--2192
%U https://proceedings.mlr.press/v89/gimenez19b.html
%V 89
%X The Model-X knockoff procedure has recently emerged as a powerful approach for feature selection with statistical guarantees. The advantage of knockoffs is that if we have a good model of the features X, then we can identify salient features without knowing anything about how the outcome Y depends on X. An important drawback of knockoffs is its instability: running the procedure twice can result in very different selected features, potentially leading to different conclusions. Addressing this instability is critical for obtaining reproducible and robust results. Here we present a generalization of the knockoff procedure that we call simultaneous multi-knockoffs. We show that multi-knockoffs guarantee false discovery rate (FDR) control, and are substantially more stable and powerful compared to the standard (single) knockoffs. Moreover we propose a new algorithm based on entropy maximization for generating Gaussian multi-knockoffs. We validate the improved stability and power of multi-knockoffs in systematic experiments. We also illustrate how multi-knockoffs can improve the accuracy of detecting genetic mutations that are causally linked to phenotypes.
APA
Gimenez, J. R., & Zou, J. (2019). Improving the Stability of the Knockoff Procedure: Multiple Simultaneous Knockoffs and Entropy Maximization. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:2184-2192. Available from https://proceedings.mlr.press/v89/gimenez19b.html.