Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information

Makoto Yamada, Gang Niu, Jun Takagi, Masashi Sugiyama.
Proceedings of the Asian Conference on Machine Learning, PMLR 20:247-262, 2011.

Abstract

The purpose of sufficient dimension reduction (SDR) is to find a low-dimensional representation of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing dependence estimation and maximization: dependence estimation is carried out analytically by the recently proposed least-squares mutual information (LSMI) estimator, and dependence maximization is also carried out analytically by exploiting the Epanechnikov kernel. Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches.
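To make the alternating scheme in the abstract concrete, the following is a minimal Python sketch of the LSMI dependence-estimation step together with a toy dependence-maximization loop. The LSMI step follows the standard closed form: the density ratio p(x,y)/(p(x)p(y)) is modeled as a kernel expansion whose coefficients solve a regularized linear system. Everything else is an assumption for illustration: the Gaussian kernels, the bandwidths and regularizer lam, and the random-orthonormal-search update in sca_random_search, which stands in for (and does not reproduce) the paper's analytic Epanechnikov-kernel maximizer.

    import numpy as np

    def gauss_kernel(a, b, sigma):
        """Gaussian kernel matrix between the rows of a and the rows of b."""
        sq = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-sq / (2.0 * sigma ** 2))

    def lsmi(x, y, sigma_x=1.0, sigma_y=1.0, lam=1e-2):
        """Analytic least-squares estimate of squared-loss mutual information.

        Fits r(x, y) = sum_l alpha_l K(x, x_l) L(y, y_l) to the density ratio
        p(x, y) / (p(x) p(y)); the regularized least-squares fit gives the
        closed form alpha = (H + lam I)^{-1} h, so no iterative optimization
        is needed for this step. Expects x and y as 2-D arrays (n, dim).
        """
        n = x.shape[0]
        K = gauss_kernel(x, x, sigma_x)            # all samples used as centers
        L = gauss_kernel(y, y, sigma_y)
        # H_{ll'} = mean_i K_il K_il' * mean_j L_jl L_jl' (i, j independent)
        H = (K.T @ K) * (L.T @ L) / n ** 2
        # h_l = mean over paired samples of K_il L_il
        h = (K * L).mean(axis=0)
        alpha = np.linalg.solve(H + lam * np.eye(n), h)
        return 0.5 * h @ alpha - 0.5               # one common plug-in SMI form

    def sca_random_search(x, y, d, n_iter=50, seed=0):
        """Toy stand-in for dependence maximization: keep the orthonormal
        d-dimensional projection W that maximizes the LSMI estimate."""
        rng = np.random.default_rng(seed)
        best_W, best_smi = None, -np.inf
        for _ in range(n_iter):
            A = rng.standard_normal((x.shape[1], d))
            W, _ = np.linalg.qr(A)                 # orthonormal columns
            smi = lsmi(x @ W, y)
            if smi > best_smi:
                best_W, best_smi = W, smi
        return best_W, best_smi

As a usage note, y should be a 2-D array (e.g., one-hot class labels for classification), and in practice the bandwidths and lam would be chosen by cross-validation over the LSMI objective rather than fixed as above.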
