Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities

Hiroaki Sasaki, Voot Tangkaratt, Masashi Sugiyama
Asian Conference on Machine Learning, PMLR 45:33-48, 2016.

Abstract

Sufficient dimension reduction (SDR) is a framework for supervised linear dimension reduction that aims to find a low-dimensional orthogonal projection matrix for input data such that the projected input retains maximal information about the output. A computationally efficient approach employs gradient estimates of the conditional density of the output given the input to find an appropriate projection matrix. However, since the gradients of the conditional densities are typically estimated by a local linear smoother, this approach does not perform well when the input dimensionality is high. In this paper, we propose a novel estimator of the gradients of logarithmic conditional densities, called the least-squares logarithmic conditional density gradients (LSLCG), which fits a gradient model directly to the true gradient under the squared loss, without conditional density estimation. Thanks to the simple least-squares formulation, LSLCG gives a closed-form solution that can be computed efficiently. In addition, all the parameters can be automatically determined by cross-validation. Through experiments on a large variety of artificial and benchmark datasets, we demonstrate that the SDR method based on LSLCG outperforms existing SDR methods both in estimation accuracy and computational efficiency.
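
To make the gradient-based SDR idea referred to in the abstract concrete, the following is a minimal sketch, not the paper's LSLCG estimator: it only illustrates how, once gradient estimates of log p(y|x) with respect to x are available, the projection matrix can be recovered from the leading eigenvectors of the averaged gradient outer products. The toy model, the oracle gradients standing in for LSLCG's estimates, and all names in the snippet are assumptions made for illustration.

```python
# Sketch of gradient-based SDR (assumption-laden illustration, not LSLCG itself).
# Toy model: y = sin(w^T x) + Gaussian noise, so y depends on x only through
# the one-dimensional sufficient direction w. For this model the gradient of
# log p(y|x) w.r.t. x is known in closed form and is used here as an oracle
# stand-in for the gradients that LSLCG would estimate from data alone.
import numpy as np

rng = np.random.default_rng(0)
n, d = 2000, 5
sigma = 0.3

# True sufficient direction w (first coordinate axis).
w = np.zeros(d)
w[0] = 1.0

X = rng.standard_normal((n, d))
y = np.sin(X @ w) + sigma * rng.standard_normal(n)

# Oracle gradients: grad_x log p(y|x) = (y - sin(w^T x)) * cos(w^T x) * w / sigma^2.
residual = y - np.sin(X @ w)
G = (residual * np.cos(X @ w))[:, None] * w[None, :] / sigma**2  # shape (n, d)

# Averaged gradient outer-product matrix; its leading eigenvectors span the
# subspace that SDR seeks, so the top eigenvector recovers w (up to sign).
M = G.T @ G / n
eigvals, eigvecs = np.linalg.eigh(M)
B_hat = eigvecs[:, -1:]  # estimated projection direction

print("estimated direction:", B_hat.ravel().round(3))
print("alignment with true w:", abs(B_hat.ravel() @ w).round(3))
```

In the paper, the oracle gradients above would be replaced by LSLCG's closed-form least-squares estimates, with the model parameters chosen by cross-validation.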

Cite this Paper


BibTeX
@InProceedings{pmlr-v45-Sasaki15,
  title     = {Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities},
  author    = {Sasaki, Hiroaki and Tangkaratt, Voot and Sugiyama, Masashi},
  booktitle = {Asian Conference on Machine Learning},
  pages     = {33--48},
  year      = {2016},
  editor    = {Holmes, Geoffrey and Liu, Tie-Yan},
  volume    = {45},
  series    = {Proceedings of Machine Learning Research},
  address   = {Hong Kong},
  month     = {20--22 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v45/Sasaki15.pdf},
  url       = {https://proceedings.mlr.press/v45/Sasaki15.html},
  abstract  = {Sufficient dimension reduction (SDR) is a framework of supervised linear dimension reduction, and is aimed at finding a low-dimensional orthogonal projection matrix for input data such that the projected input data retains maximal information on output data. A computationally efficient approach employs gradient estimates of the conditional density of the output given input data to find an appropriate projection matrix. However, since the gradients of the conditional densities are typically estimated by a local linear smoother, it does not perform well when the input dimensionality is high. In this paper, we propose a novel estimator of the gradients of logarithmic conditional densities called the \emph{least-squares logarithmic conditional density gradients} (LSLCG), which fits a gradient model \emph{directly} to the true gradient without conditional density estimation under the squared loss. Thanks to the simple least-squares formulation, LSLCG gives a closed-form solution that can be computed efficiently. In addition, all the parameters can be automatically determined by cross-validation. Through experiments on a large variety of artificial and benchmark datasets, we demonstrate that the SDR method based on LSLCG outperforms existing SDR methods both in estimation accuracy and computational efficiency.}
}
EndNote
%0 Conference Paper
%T Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
%A Hiroaki Sasaki
%A Voot Tangkaratt
%A Masashi Sugiyama
%B Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2016
%E Geoffrey Holmes
%E Tie-Yan Liu
%F pmlr-v45-Sasaki15
%I PMLR
%P 33--48
%U https://proceedings.mlr.press/v45/Sasaki15.html
%V 45
%X Sufficient dimension reduction (SDR) is a framework of supervised linear dimension reduction, and is aimed at finding a low-dimensional orthogonal projection matrix for input data such that the projected input data retains maximal information on output data. A computationally efficient approach employs gradient estimates of the conditional density of the output given input data to find an appropriate projection matrix. However, since the gradients of the conditional densities are typically estimated by a local linear smoother, it does not perform well when the input dimensionality is high. In this paper, we propose a novel estimator of the gradients of logarithmic conditional densities called the least-squares logarithmic conditional density gradients (LSLCG), which fits a gradient model directly to the true gradient without conditional density estimation under the squared loss. Thanks to the simple least-squares formulation, LSLCG gives a closed-form solution that can be computed efficiently. In addition, all the parameters can be automatically determined by cross-validation. Through experiments on a large variety of artificial and benchmark datasets, we demonstrate that the SDR method based on LSLCG outperforms existing SDR methods both in estimation accuracy and computational efficiency.
RIS
TY - CPAPER
TI - Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
AU - Hiroaki Sasaki
AU - Voot Tangkaratt
AU - Masashi Sugiyama
BT - Asian Conference on Machine Learning
DA - 2016/02/25
ED - Geoffrey Holmes
ED - Tie-Yan Liu
ID - pmlr-v45-Sasaki15
PB - PMLR
DP - Proceedings of Machine Learning Research
VL - 45
SP - 33
EP - 48
L1 - http://proceedings.mlr.press/v45/Sasaki15.pdf
UR - https://proceedings.mlr.press/v45/Sasaki15.html
AB - Sufficient dimension reduction (SDR) is a framework of supervised linear dimension reduction, and is aimed at finding a low-dimensional orthogonal projection matrix for input data such that the projected input data retains maximal information on output data. A computationally efficient approach employs gradient estimates of the conditional density of the output given input data to find an appropriate projection matrix. However, since the gradients of the conditional densities are typically estimated by a local linear smoother, it does not perform well when the input dimensionality is high. In this paper, we propose a novel estimator of the gradients of logarithmic conditional densities called the least-squares logarithmic conditional density gradients (LSLCG), which fits a gradient model directly to the true gradient without conditional density estimation under the squared loss. Thanks to the simple least-squares formulation, LSLCG gives a closed-form solution that can be computed efficiently. In addition, all the parameters can be automatically determined by cross-validation. Through experiments on a large variety of artificial and benchmark datasets, we demonstrate that the SDR method based on LSLCG outperforms existing SDR methods both in estimation accuracy and computational efficiency.
ER -
APA
Sasaki, H., Tangkaratt, V. & Sugiyama, M. (2016). Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities. Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 45:33-48. Available from https://proceedings.mlr.press/v45/Sasaki15.html.