Fast, Exact Model Selection and Permutation Testing for l2-Regularized Logistic Regression


Bryan Conroy, Paul Sajda;
Proceedings of the Fifteenth International Conference on Artificial Intelligence and Statistics, PMLR 22:246-254, 2012.


Regularized logistic regression is a standard classification method in statistics and machine learning. Unlike regularized least squares problems such as ridge regression, its parameter estimates cannot be computed in closed form and must instead be obtained by an iterative technique. This paper addresses a computational problem that commonly arises in model selection and in statistical significance testing of classifiers, where a large number of related logistic regression problems must be solved. Our proposed approach solves the problems simultaneously through an iterative technique, gaining computational efficiency by exploiting the redundancies across the related problems. We show analytically that our method provides a substantial complexity reduction, and we validate this further with results on real-world datasets.
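To make the setting concrete, the sketch below fits l2-regularized logistic regression over a grid of regularization parameters, warm-starting each Newton (IRLS) solve from the previous solution. This is a standard way to exploit redundancy across related problems in model selection; it is an illustrative baseline only, not the exact algorithm proposed in the paper, and the function name and interface are hypothetical.

```python
import numpy as np

def fit_l2_logreg_path(X, y, lambdas, tol=1e-8, max_iter=100):
    """Fit l2-regularized logistic regression for a grid of penalties.

    Warm-starts each Newton solve from the previous lambda's solution,
    a common way to share work across related problems (illustrative
    sketch; not the method of the paper).
    """
    n, d = X.shape
    w = np.zeros(d)
    path = []
    for lam in lambdas:
        for _ in range(max_iter):
            z = X @ w
            p = 1.0 / (1.0 + np.exp(-z))           # predicted probabilities
            grad = X.T @ (p - y) + lam * w         # gradient of penalized NLL
            W = p * (1.0 - p)                      # IRLS weights
            H = X.T @ (X * W[:, None]) + lam * np.eye(d)  # penalized Hessian
            step = np.linalg.solve(H, grad)        # Newton direction
            w = w - step
            if np.linalg.norm(step) < tol:
                break
        path.append(w.copy())
    return path
```

Each solve after the first typically needs only a few Newton iterations, since neighboring penalties yield nearby solutions; the paper's contribution goes further by solving the related problems jointly rather than sequentially.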
