Sparse Principal Component Analysis for High Dimensional Multivariate Time Series

Zhaoran Wang, Fang Han, Han Liu
Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, PMLR 31:48-56, 2013.

Abstract

We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyze multivariate time series as if the data were generated i.i.d. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d≫T), we provide explicit rates of convergence for the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper bear on a range of applications, including financial time series, biomedical imaging, and social media.

Cite this Paper


BibTeX
@InProceedings{pmlr-v31-wang13a,
  title     = {Sparse Principal Component Analysis for High Dimensional Multivariate Time Series},
  author    = {Wang, Zhaoran and Han, Fang and Liu, Han},
  booktitle = {Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics},
  pages     = {48--56},
  year      = {2013},
  editor    = {Carvalho, Carlos M. and Ravikumar, Pradeep},
  volume    = {31},
  series    = {Proceedings of Machine Learning Research},
  address   = {Scottsdale, Arizona, USA},
  month     = {29 Apr--01 May},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v31/wang13a.pdf},
  url       = {https://proceedings.mlr.press/v31/wang13a.html},
  abstract  = {We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyze multivariate time series as if the data were generated i.i.d. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d≫T), we provide explicit rates of convergence for the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper bear on a range of applications, including financial time series, biomedical imaging, and social media.},
  note      = {Notable paper award}
}
Endnote
%0 Conference Paper
%T Sparse Principal Component Analysis for High Dimensional Multivariate Time Series
%A Zhaoran Wang
%A Fang Han
%A Han Liu
%B Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2013
%E Carlos M. Carvalho
%E Pradeep Ravikumar
%F pmlr-v31-wang13a
%I PMLR
%P 48--56
%U https://proceedings.mlr.press/v31/wang13a.html
%V 31
%X We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyze multivariate time series as if the data were generated i.i.d. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d≫T), we provide explicit rates of convergence for the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper bear on a range of applications, including financial time series, biomedical imaging, and social media.
%Z Notable paper award
RIS
TY  - CPAPER
TI  - Sparse Principal Component Analysis for High Dimensional Multivariate Time Series
AU  - Zhaoran Wang
AU  - Fang Han
AU  - Han Liu
BT  - Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics
DA  - 2013/04/29
ED  - Carlos M. Carvalho
ED  - Pradeep Ravikumar
ID  - pmlr-v31-wang13a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 31
SP  - 48
EP  - 56
L1  - http://proceedings.mlr.press/v31/wang13a.pdf
UR  - https://proceedings.mlr.press/v31/wang13a.html
AB  - We study sparse principal component analysis (sparse PCA) for high dimensional multivariate vector autoregressive (VAR) time series. By treating the transition matrix as a nuisance parameter, we show that sparse PCA can be directly applied to analyze multivariate time series as if the data were generated i.i.d. Under a double asymptotic framework in which both the length of the sample period T and the dimensionality d of the time series can increase (with possibly d≫T), we provide explicit rates of convergence for the angle between the estimated and population leading eigenvectors of the time series covariance matrix. Our results suggest that the spectral norm of the transition matrix plays a pivotal role in determining the final rates of convergence. The implications of this general result are further illustrated using concrete examples. The results of this paper bear on a range of applications, including financial time series, biomedical imaging, and social media.
N1  - Notable paper award
ER  -
APA
Wang, Z., Han, F., & Liu, H. (2013). Sparse Principal Component Analysis for High Dimensional Multivariate Time Series. Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 31:48-56. Available from https://proceedings.mlr.press/v31/wang13a.html. Notable paper award.