Sufficient Dimension Reduction via Squared-loss Mutual Information Estimation

Taiji Suzuki, Masashi Sugiyama
Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, PMLR 9:804-811, 2010.

Abstract

The goal of sufficient dimension reduction in supervised learning is to find the low-dimensional subspace of input features that is "sufficient" for predicting output values. In this paper, we propose a novel sufficient dimension reduction method using a squared-loss variant of mutual information as a dependency measure. We utilize an analytic approximator of squared-loss mutual information based on density ratio estimation, which is shown to possess suitable convergence properties. We then develop a natural gradient algorithm for sufficient subspace search. Numerical experiments show that the proposed method compares favorably with existing dimension reduction approaches.
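The squared-loss mutual information (SMI) used as the dependency measure is SMI(X, Y) = (1/2) ∫∫ p_x(x) p_y(y) (p_xy(x,y)/(p_x(x)p_y(y)) − 1)² dx dy, which the paper approximates analytically by fitting the density ratio r(x,y) = p_xy(x,y)/(p_x(x)p_y(y)) with a linear-in-parameters model. The following is a minimal, hypothetical sketch of such a least-squares density-ratio estimator with Gaussian kernels; the function name, fixed kernel width `sigma`, and ridge parameter `lam` are illustrative choices, not the paper's actual implementation (which also involves cross-validated model selection and a natural gradient search over projection matrices).

```python
import numpy as np

def lsmi(x, y, sigma=1.0, lam=1e-3):
    """Sketch of a least-squares SMI estimator (hypothetical parameters).

    Fits the density ratio r(x, y) = p_xy / (p_x p_y) as a linear
    combination of Gaussian kernels centered on the paired samples,
    then plugs the fit into SMI = (1/2) (E_xy[r] - 1).
    """
    n = len(x)
    x = np.asarray(x, dtype=float).reshape(n, -1)
    y = np.asarray(y, dtype=float).reshape(n, -1)
    # Gaussian kernel matrices; all n samples serve as kernel centers.
    K = np.exp(-((x[:, None, :] - x[None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    L = np.exp(-((y[:, None, :] - y[None, :, :]) ** 2).sum(-1) / (2 * sigma**2))
    # H: second moment of the basis under the product of marginals
    # (empirically, all n^2 shuffled pairs); h: first moment under p_xy.
    H = (K.T @ K) * (L.T @ L) / n**2
    h = (K * L).mean(axis=0)
    # Ridge-regularized least-squares fit of the ratio coefficients.
    alpha = np.linalg.solve(H + lam * np.eye(n), h)
    # Plug-in estimate of SMI = (1/2) (E_xy[r] - 1).
    return 0.5 * (h @ alpha - 1.0)
```

On dependent data the estimate should be clearly positive, while on independent data it should stay near zero; in the sufficient dimension reduction setting, this estimate is what the subspace search maximizes.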

Cite this Paper


BibTeX
@InProceedings{pmlr-v9-suzuki10a,
  title     = {Sufficient Dimension Reduction via Squared-loss Mutual Information Estimation},
  author    = {Suzuki, Taiji and Sugiyama, Masashi},
  booktitle = {Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {804--811},
  year      = {2010},
  editor    = {Teh, Yee Whye and Titterington, Mike},
  volume    = {9},
  series    = {Proceedings of Machine Learning Research},
  address   = {Chia Laguna Resort, Sardinia, Italy},
  month     = {13--15 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v9/suzuki10a/suzuki10a.pdf},
  url       = {https://proceedings.mlr.press/v9/suzuki10a.html},
  abstract  = {The goal of sufficient dimension reduction in supervised learning is to find the low-dimensional subspace of input features that is "sufficient" for predicting output values. In this paper, we propose a novel sufficient dimension reduction method using a squared-loss variant of mutual information as a dependency measure. We utilize an analytic approximator of squared-loss mutual information based on density ratio estimation, which is shown to possess suitable convergence properties. We then develop a natural gradient algorithm for sufficient subspace search. Numerical experiments show that the proposed method compares favorably with existing dimension reduction approaches.}
}
Endnote
%0 Conference Paper
%T Sufficient Dimension Reduction via Squared-loss Mutual Information Estimation
%A Taiji Suzuki
%A Masashi Sugiyama
%B Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2010
%E Yee Whye Teh
%E Mike Titterington
%F pmlr-v9-suzuki10a
%I PMLR
%P 804--811
%U https://proceedings.mlr.press/v9/suzuki10a.html
%V 9
%X The goal of sufficient dimension reduction in supervised learning is to find the low-dimensional subspace of input features that is "sufficient" for predicting output values. In this paper, we propose a novel sufficient dimension reduction method using a squared-loss variant of mutual information as a dependency measure. We utilize an analytic approximator of squared-loss mutual information based on density ratio estimation, which is shown to possess suitable convergence properties. We then develop a natural gradient algorithm for sufficient subspace search. Numerical experiments show that the proposed method compares favorably with existing dimension reduction approaches.
RIS
TY - CPAPER
TI - Sufficient Dimension Reduction via Squared-loss Mutual Information Estimation
AU - Taiji Suzuki
AU - Masashi Sugiyama
BT - Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics
DA - 2010/03/31
ED - Yee Whye Teh
ED - Mike Titterington
ID - pmlr-v9-suzuki10a
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 9
SP - 804
EP - 811
L1 - http://proceedings.mlr.press/v9/suzuki10a/suzuki10a.pdf
UR - https://proceedings.mlr.press/v9/suzuki10a.html
AB - The goal of sufficient dimension reduction in supervised learning is to find the low-dimensional subspace of input features that is "sufficient" for predicting output values. In this paper, we propose a novel sufficient dimension reduction method using a squared-loss variant of mutual information as a dependency measure. We utilize an analytic approximator of squared-loss mutual information based on density ratio estimation, which is shown to possess suitable convergence properties. We then develop a natural gradient algorithm for sufficient subspace search. Numerical experiments show that the proposed method compares favorably with existing dimension reduction approaches.
ER -
APA
Suzuki, T. & Sugiyama, M. (2010). Sufficient Dimension Reduction via Squared-loss Mutual Information Estimation. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 9:804-811. Available from https://proceedings.mlr.press/v9/suzuki10a.html.