Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information

Makoto Yamada, Gang Niu, Jun Takagi, Masashi Sugiyama
Proceedings of the Asian Conference on Machine Learning, PMLR 20:247-262, 2011.

Abstract

The purpose of sufficient dimension reduction (SDR) is to find a low-dimensional expression of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing dependence estimation and maximization: dependence estimation is carried out analytically by the recently proposed least-squares mutual information (LSMI) estimator, and dependence maximization is also carried out analytically by utilizing the Epanechnikov kernel. Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches.
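
To make the alternating idea in the abstract concrete, below is a minimal NumPy sketch of an LSMI-style dependence estimate combined with a naive projection search. It is illustrative only: the function names (gaussian_kernel, lsmi, sca_like_search), the Gaussian kernel widths, the regularization parameter, the plug-in form of the SMI estimate, and the random-candidate search over projections W are all assumptions made for this sketch; the paper's SCA instead updates the projection analytically via an Epanechnikov-kernel formulation.

import numpy as np

def gaussian_kernel(A, B, sigma):
    # Pairwise Gaussian kernel between rows of A (n x d) and rows of B (m x d).
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

def lsmi(z, y, sigma_z=1.0, sigma_y=1.0, lam=1e-3):
    # LSMI-style estimate of squared-loss mutual information between z (n x m)
    # and y (n x p, e.g. one-hot labels).  The density ratio p(z, y) / (p(z) p(y))
    # is modelled with Gaussian kernel basis functions centred on the samples and
    # fitted by regularized least squares, which has a closed-form solution.
    n = z.shape[0]
    Kz = gaussian_kernel(z, z, sigma_z)           # basis design matrix in z
    Ky = gaussian_kernel(y, y, sigma_y)           # basis design matrix in y
    H = (Kz.T @ Kz) * (Ky.T @ Ky) / n**2          # cross-moment matrix
    h = np.mean(Kz * Ky, axis=0)                  # mean joint basis values
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    return 0.5 * h @ alpha - 0.5                  # one common plug-in SMI form

def random_orthonormal(d, m, rng):
    # Rows of W span a random m-dimensional subspace of R^d.
    Q, _ = np.linalg.qr(rng.standard_normal((d, m)))
    return Q.T

def sca_like_search(X, Y, m, n_candidates=50, seed=0):
    # Illustrative subspace search: score random orthonormal projections by LSMI
    # and keep the best one.  This is NOT the paper's analytic update; it only
    # shows the "estimate dependence, then pick a better projection" loop.
    rng = np.random.default_rng(seed)
    best_W, best_score = None, -np.inf
    for _ in range(n_candidates):
        W = random_orthonormal(X.shape[1], m, rng)
        score = lsmi(X @ W.T, Y)
        if score > best_score:
            best_W, best_score = W, score
    return best_W, best_score

Usage would look like W, score = sca_like_search(X, Y, m=2) for features X of shape (n, d) and a 2-D target matrix Y; the returned W projects X onto the candidate subspace with the largest estimated squared-loss mutual information.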

Cite this Paper


BibTeX
@InProceedings{pmlr-v20-yamada11, title = {Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information}, author = {Yamada, Makoto and Niu, Gang and Takagi, Jun and Sugiyama, Masashi}, booktitle = {Proceedings of the Asian Conference on Machine Learning}, pages = {247--262}, year = {2011}, editor = {Hsu, Chun-Nan and Lee, Wee Sun}, volume = {20}, series = {Proceedings of Machine Learning Research}, address = {South Garden Hotels and Resorts, Taoyuan, Taiwan}, month = {14--15 Nov}, publisher = {PMLR}, pdf = {http://proceedings.mlr.press/v20/yamada11/yamada11.pdf}, url = {https://proceedings.mlr.press/v20/yamada11.html}, abstract = {The purpose of sufficient dimension reduction (SDR) is to find a low-dimensional expression of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing dependence estimation and maximization: Dependence estimation is analytically carried out by recently-proposed least-squares mutual information (LSMI), and dependence maximization is also analytically carried out by utilizing the Epanechnikov kernel. Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches.} }
EndNote
%0 Conference Paper %T Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information %A Makoto Yamada %A Gang Niu %A Jun Takagi %A Masashi Sugiyama %B Proceedings of the Asian Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2011 %E Chun-Nan Hsu %E Wee Sun Lee %F pmlr-v20-yamada11 %I PMLR %P 247--262 %U https://proceedings.mlr.press/v20/yamada11.html %V 20 %X The purpose of sufficient dimension reduction (SDR) is to find a low-dimensional expression of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing dependence estimation and maximization: Dependence estimation is analytically carried out by recently-proposed least-squares mutual information (LSMI), and dependence maximization is also analytically carried out by utilizing the Epanechnikov kernel. Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches.
RIS
TY - CPAPER TI - Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information AU - Makoto Yamada AU - Gang Niu AU - Jun Takagi AU - Masashi Sugiyama BT - Proceedings of the Asian Conference on Machine Learning DA - 2011/11/17 ED - Chun-Nan Hsu ED - Wee Sun Lee ID - pmlr-v20-yamada11 PB - PMLR DP - Proceedings of Machine Learning Research VL - 20 SP - 247 EP - 262 L1 - http://proceedings.mlr.press/v20/yamada11/yamada11.pdf UR - https://proceedings.mlr.press/v20/yamada11.html AB - The purpose of sufficient dimension reduction (SDR) is to find a low-dimensional expression of input features that is sufficient for predicting output values. In this paper, we propose a novel distribution-free SDR method called sufficient component analysis (SCA), which is computationally more efficient than existing methods. In our method, a solution is computed by iteratively performing dependence estimation and maximization: Dependence estimation is analytically carried out by recently-proposed least-squares mutual information (LSMI), and dependence maximization is also analytically carried out by utilizing the Epanechnikov kernel. Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches. ER -
APA
Yamada, M., Niu, G., Takagi, J. & Sugiyama, M. (2011). Computationally Efficient Sufficient Dimension Reduction via Squared-Loss Mutual Information. Proceedings of the Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 20:247-262. Available from https://proceedings.mlr.press/v20/yamada11.html.
