Statistical Analysis of Karcher Means for Random Restricted PSD Matrices

Hengchao Chen, Xiang Li, Qiang Sun
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:1437-1456, 2023.

Abstract

Non-asymptotic statistical analysis is often missing for modern geometry-aware machine learning algorithms due to the possibly intricate non-linear manifold structure. This paper studies an intrinsic mean model on the manifold of restricted positive semi-definite matrices and provides a non-asymptotic statistical analysis of the Karcher mean. We also consider a general extrinsic signal-plus-noise model, under which a deterministic error bound of the Karcher mean is provided. As an application, we show that the distributed principal component analysis algorithm, LRC-dPCA, achieves the same performance as the full sample PCA algorithm. Numerical experiments lend strong support to our theories.
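To make the notion of a Karcher (Fréchet) mean concrete, below is a minimal, hypothetical sketch of the classical fixed-point iteration for the Karcher mean of full-rank symmetric positive-definite matrices under the affine-invariant metric. This is not the paper's algorithm: the paper studies the manifold of restricted (fixed-rank) PSD matrices, whose geometry differs from the full-rank case, and the function name `karcher_mean_spd` and all parameters here are illustrative assumptions only.

```python
# Hypothetical illustration: Karcher mean of full-rank SPD matrices via the
# standard fixed-point iteration under the affine-invariant metric. The paper
# itself concerns restricted (fixed low-rank) PSD matrices, whose Karcher mean
# is defined with respect to a different manifold geometry.
import numpy as np
from scipy.linalg import expm, logm, sqrtm, inv

def karcher_mean_spd(mats, n_iter=50, tol=1e-10):
    """Fixed-point iteration for the Karcher mean of SPD matrices."""
    X = sum(mats) / len(mats)              # initialize at the Euclidean mean
    for _ in range(n_iter):
        X_half = sqrtm(X)
        X_half_inv = inv(X_half)
        # Riemannian "gradient": average of log maps at the current iterate
        G = sum(logm(X_half_inv @ A @ X_half_inv) for A in mats) / len(mats)
        X = X_half @ expm(G) @ X_half
        if np.linalg.norm(G) < tol:        # stop when the mean log map vanishes
            break
    return np.real(X)

# Usage example: average a few random SPD matrices
rng = np.random.default_rng(0)
samples = []
for _ in range(5):
    B = rng.standard_normal((4, 4))
    samples.append(B @ B.T + 1e-2 * np.eye(4))
print(karcher_mean_spd(samples))
```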

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-chen23a,
  title     = {Statistical Analysis of Karcher Means for Random Restricted PSD Matrices},
  author    = {Chen, Hengchao and Li, Xiang and Sun, Qiang},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {1437--1456},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/chen23a/chen23a.pdf},
  url       = {https://proceedings.mlr.press/v206/chen23a.html},
  abstract  = {Non-asymptotic statistical analysis is often missing for modern geometry-aware machine learning algorithms due to the possibly intricate non-linear manifold structure. This paper studies an intrinsic mean model on the manifold of restricted positive semi-definite matrices and provides a non-asymptotic statistical analysis of the Karcher mean. We also consider a general extrinsic signal-plus-noise model, under which a deterministic error bound of the Karcher mean is provided. As an application, we show that the distributed principal component analysis algorithm, LRC-dPCA, achieves the same performance as the full sample PCA algorithm. Numerical experiments lend strong support to our theories.}
}
Endnote
%0 Conference Paper
%T Statistical Analysis of Karcher Means for Random Restricted PSD Matrices
%A Hengchao Chen
%A Xiang Li
%A Qiang Sun
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-chen23a
%I PMLR
%P 1437--1456
%U https://proceedings.mlr.press/v206/chen23a.html
%V 206
%X Non-asymptotic statistical analysis is often missing for modern geometry-aware machine learning algorithms due to the possibly intricate non-linear manifold structure. This paper studies an intrinsic mean model on the manifold of restricted positive semi-definite matrices and provides a non-asymptotic statistical analysis of the Karcher mean. We also consider a general extrinsic signal-plus-noise model, under which a deterministic error bound of the Karcher mean is provided. As an application, we show that the distributed principal component analysis algorithm, LRC-dPCA, achieves the same performance as the full sample PCA algorithm. Numerical experiments lend strong support to our theories.
APA
Chen, H., Li, X. & Sun, Q. (2023). Statistical Analysis of Karcher Means for Random Restricted PSD Matrices. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:1437-1456. Available from https://proceedings.mlr.press/v206/chen23a.html.
