Sufficient Dimension Reduction via Squared-loss Mutual Information Estimation
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:804-811, 2010.
The goal of sufficient dimension reduction in supervised learning is to find a low-dimensional subspace of the input features that is "sufficient" for predicting output values. In this paper, we propose a novel sufficient dimension reduction method using a squared-loss variant of mutual information as a dependency measure. We utilize an analytic approximator of squared-loss mutual information based on density-ratio estimation, which is shown to possess suitable convergence properties. We then develop a natural gradient algorithm for sufficient subspace search. Numerical experiments show that the proposed method compares favorably with existing dimension reduction approaches.
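The analytic approximator mentioned above can be illustrated with a small sketch: squared-loss mutual information, SMI = (1/2)∬ p(x)p(y) (p(x,y)/(p(x)p(y)) − 1)² dx dy, reduces to (1/2) E_{p(x,y)}[w(x,y)] − 1/2 for the density ratio w(x,y) = p(x,y)/(p(x)p(y)), so estimating w by regularized least squares gives a closed-form SMI estimate. The sketch below assumes product Gaussian kernels with fixed bandwidth `sigma` and regularization `lam` (hypothetical fixed values; in practice these would be tuned, e.g. by cross-validation), and is not the paper's exact implementation.

```python
import numpy as np

def lsmi(x, y, n_basis=50, sigma=1.0, lam=1e-3, seed=0):
    """Closed-form least-squares estimate of squared-loss mutual information.

    Models the density ratio w(x, y) = p(x, y) / (p(x) p(y)) as a linear
    combination of product Gaussian kernels centered at random sample pairs,
    and fits the coefficients by regularized least squares.
    """
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=min(n_basis, n), replace=False)
    # Gaussian kernel matrices of all samples against the chosen centers
    Kx = np.exp(-((x[:, None, :] - x[idx][None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    Ky = np.exp(-((y[:, None, :] - y[idx][None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    # h_l = (1/n) sum_i phi_l(x_i, y_i): average over paired samples ~ p(x, y)
    h = (Kx * Ky).mean(axis=0)
    # H_{ll'} = (1/n^2) sum_{i,j} phi_l(x_i, y_j) phi_l'(x_i, y_j):
    # average over all cross-pairs ~ p(x) p(y); the product kernel factorizes
    H = (Kx.T @ Kx) * (Ky.T @ Ky) / n**2
    alpha = np.linalg.solve(H + lam * np.eye(len(h)), h)
    # SMI = (1/2) E_{p(x,y)}[w(x, y)] - 1/2, with E[w] estimated by h^T alpha
    return 0.5 * h @ alpha - 0.5
```

For dimension reduction, such an estimator would be evaluated on projected inputs W·x and the projection W optimized over subspaces; the natural gradient search itself is omitted here.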