Non-Gaussian Component Analysis with Log-Density Gradient Estimation

Hiroaki Sasaki, Gang Niu, Masashi Sugiyama
Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, PMLR 51:1177-1185, 2016.

Abstract

Non-Gaussian component analysis (NGCA) aims to identify a linear subspace such that the projected data follow a non-Gaussian distribution. In this paper, we propose a novel NGCA algorithm based on log-density gradient estimation. Unlike existing methods, the proposed algorithm identifies the linear subspace via a single eigenvalue decomposition, without any iterative procedure, and is therefore computationally efficient. Furthermore, through theoretical analysis, we prove that the identified subspace converges to the true subspace at the optimal parametric rate. Finally, the practical performance of the proposed algorithm is demonstrated on both artificial and benchmark datasets.
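To make the eigendecomposition step concrete, below is a minimal Python sketch of the pipeline the abstract describes: estimate the log-density gradient, remove the Gaussian part, and take the top eigenvectors of an averaged outer product. It assumes the data have already been whitened and substitutes a simple Gaussian-KDE plug-in score estimator for the paper's least-squares log-density gradient estimator; the function name, bandwidth, and toy data are illustrative assumptions, not part of the paper.

import numpy as np

def ngca_subspace_sketch(X, dim, bandwidth=1.0):
    # X: (n, D) data, assumed already whitened (zero mean, identity covariance).
    # Returns a (D, dim) orthonormal basis of the estimated non-Gaussian subspace.
    n, D = X.shape

    # Gaussian-KDE plug-in estimate of the log-density gradient (score),
    # used here only as a stand-in for the paper's estimator.
    diffs = X[:, None, :] - X[None, :, :]            # (n, n, D), diffs[i, j] = x_i - x_j
    sq_dists = np.sum(diffs ** 2, axis=2)            # (n, n)
    K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))   # kernel weights
    W = K / K.sum(axis=1, keepdims=True)
    # grad log p_hat(x_i) = -(1 / h^2) * sum_j W_ij (x_i - x_j)
    grad_logp = -(W[:, :, None] * diffs).sum(axis=1) / bandwidth ** 2

    # For whitened data, grad log p(x) + x vanishes along purely Gaussian
    # directions, so these vectors concentrate in the non-Gaussian subspace.
    V = grad_logp + X

    # One eigenvalue decomposition of the averaged outer product; the
    # top-`dim` eigenvectors span the estimated subspace (no iterations).
    C = V.T @ V / n
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order[:dim]]

# Toy check: two uniform (non-Gaussian) coordinates hidden among eight Gaussian ones.
rng = np.random.default_rng(0)
n = 500
S = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n, 2))  # unit-variance uniforms
N = rng.standard_normal(size=(n, 8))
X = np.hstack([S, N])                                       # approximately white
B = ngca_subspace_sketch(X, dim=2)
print(B.shape)  # (10, 2): basis of the estimated non-Gaussian subspace

In this toy setup the returned basis should roughly span the first two (uniform) coordinates; the paper's convergence guarantee applies to its own gradient estimator, which this sketch does not implement.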

Cite this Paper


BibTeX
@InProceedings{pmlr-v51-sasaki16,
  title     = {Non-Gaussian Component Analysis with Log-Density Gradient Estimation},
  author    = {Hiroaki Sasaki and Gang Niu and Masashi Sugiyama},
  pages     = {1177--1185},
  year      = {2016},
  editor    = {Arthur Gretton and Christian C. Robert},
  volume    = {51},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cadiz, Spain},
  month     = {09--11 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v51/sasaki16.pdf},
  url       = {http://proceedings.mlr.press/v51/sasaki16.html},
  abstract  = {Non-Gaussian component analysis (NGCA) is aimed at identifying a linear subspace such that the projected data follows a non-Gaussian distribution. In this paper, we propose a novel NGCA algorithm based on log-density gradient estimation. Unlike existing methods, the proposed NGCA algorithm identifies the linear subspace by using the eigenvalue decomposition without any iterative procedures, and thus is computationally reasonable. Furthermore, through theoretical analysis, we prove that the identified subspace converges to the true subspace at the optimal parametric rate. Finally, the practical performance of the proposed algorithm is demonstrated on both artificial and benchmark datasets.}
}
EndNote
%0 Conference Paper
%T Non-Gaussian Component Analysis with Log-Density Gradient Estimation
%A Hiroaki Sasaki
%A Gang Niu
%A Masashi Sugiyama
%B Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2016
%E Arthur Gretton
%E Christian C. Robert
%F pmlr-v51-sasaki16
%I PMLR
%J Proceedings of Machine Learning Research
%P 1177--1185
%U http://proceedings.mlr.press
%V 51
%W PMLR
%X Non-Gaussian component analysis (NGCA) is aimed at identifying a linear subspace such that the projected data follows a non-Gaussian distribution. In this paper, we propose a novel NGCA algorithm based on log-density gradient estimation. Unlike existing methods, the proposed NGCA algorithm identifies the linear subspace by using the eigenvalue decomposition without any iterative procedures, and thus is computationally reasonable. Furthermore, through theoretical analysis, we prove that the identified subspace converges to the true subspace at the optimal parametric rate. Finally, the practical performance of the proposed algorithm is demonstrated on both artificial and benchmark datasets.
RIS
TY - CPAPER
TI - Non-Gaussian Component Analysis with Log-Density Gradient Estimation
AU - Hiroaki Sasaki
AU - Gang Niu
AU - Masashi Sugiyama
BT - Proceedings of the 19th International Conference on Artificial Intelligence and Statistics
PY - 2016/05/02
DA - 2016/05/02
ED - Arthur Gretton
ED - Christian C. Robert
ID - pmlr-v51-sasaki16
PB - PMLR
SP - 1177
DP - PMLR
EP - 1185
L1 - http://proceedings.mlr.press/v51/sasaki16.pdf
UR - http://proceedings.mlr.press/v51/sasaki16.html
AB - Non-Gaussian component analysis (NGCA) is aimed at identifying a linear subspace such that the projected data follows a non-Gaussian distribution. In this paper, we propose a novel NGCA algorithm based on log-density gradient estimation. Unlike existing methods, the proposed NGCA algorithm identifies the linear subspace by using the eigenvalue decomposition without any iterative procedures, and thus is computationally reasonable. Furthermore, through theoretical analysis, we prove that the identified subspace converges to the true subspace at the optimal parametric rate. Finally, the practical performance of the proposed algorithm is demonstrated on both artificial and benchmark datasets.
ER -
APA
Sasaki, H., Niu, G. & Sugiyama, M. (2016). Non-Gaussian Component Analysis with Log-Density Gradient Estimation. Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, in PMLR 51:1177-1185.