Forecastable Component Analysis

Georg Goerg
Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):64-72, 2013.

Abstract

I introduce Forecastable Component Analysis (ForeCA), a novel dimension reduction technique for temporally dependent signals. Based on a new forecastability measure, ForeCA finds an optimal transformation to separate a multivariate time series into a forecastable and an orthogonal white noise space. I present a converging algorithm with a fast eigenvector solution. Applications to financial and macro-economic time series show that ForeCA can successfully discover informative structure, which can be used for forecasting as well as classification. The R package ForeCA accompanies this work and is publicly available on CRAN.
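
The forecastability measure at the heart of ForeCA scores a signal by how far its spectral density is from the flat spectrum of white noise, and the most forecastable component is the linear combination of the observed series that maximizes this score. The Python code below is only a rough illustrative sketch of that idea, not the paper's method: it uses a discretized spectral entropy estimated with a Welch periodogram, whitening via a Cholesky factor, and a crude random search over unit-norm weights in place of the converging eigenvector algorithm described in the paper. The names forecastability and foreca_first_component are made up for this example; the ForeCA R package on CRAN is the reference implementation.

    # Illustrative sketch of the ForeCA idea (NOT the paper's exact algorithm):
    # forecastability Omega(y) = 1 - H(normalized spectrum of y) / log(#frequencies),
    # estimated with a Welch periodogram; the most forecastable single component
    # is found here by a simple random search over unit-norm weight vectors.
    import numpy as np
    from scipy.signal import welch


    def forecastability(y, nperseg=256):
        """Omega(y): roughly 0 for white noise, close to 1 for highly forecastable signals."""
        _, psd = welch(y - y.mean(), nperseg=min(nperseg, len(y)))
        p = psd / psd.sum()                      # normalize spectrum to a probability mass
        p = p[p > 0]
        spectral_entropy = -(p * np.log(p)).sum()
        return 1.0 - spectral_entropy / np.log(len(psd))


    def foreca_first_component(X, n_draws=2000, seed=0):
        """Crude search for the most forecastable linear combination w'X_t with ||w|| = 1."""
        rng = np.random.default_rng(seed)
        Xc = X - X.mean(axis=0)
        # whiten so every unit-norm combination has unit variance (as ForeCA assumes)
        L = np.linalg.cholesky(np.cov(Xc, rowvar=False))
        U = Xc @ np.linalg.inv(L.T)
        best_w, best_omega = None, -np.inf
        for _ in range(n_draws):
            w = rng.standard_normal(X.shape[1])
            w /= np.linalg.norm(w)
            omega = forecastability(U @ w)
            if omega > best_omega:
                best_w, best_omega = w, omega
        return best_w, best_omega, U @ best_w


    if __name__ == "__main__":
        # toy example: one hidden forecastable (sinusoidal) component mixed with noise
        T = 1000
        t = np.arange(T)
        signal = np.sin(2 * np.pi * t / 50)
        noise = np.random.default_rng(1).standard_normal((T, 2))
        X = np.column_stack([signal + 0.5 * noise[:, 0], noise[:, 1]])
        w, omega, component = foreca_first_component(X)
        print("weights:", np.round(w, 2), "Omega:", round(omega, 3))

On this toy input the search should place most of its weight on the first coordinate, where the sinusoidal (forecastable) structure lives, and report an Omega well above that of pure white noise.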

Cite this Paper


BibTeX
@InProceedings{pmlr-v28-goerg13,
  title     = {Forecastable Component Analysis},
  author    = {Georg Goerg},
  booktitle = {Proceedings of the 30th International Conference on Machine Learning},
  pages     = {64--72},
  year      = {2013},
  editor    = {Sanjoy Dasgupta and David McAllester},
  volume    = {28},
  number    = {2},
  series    = {Proceedings of Machine Learning Research},
  address   = {Atlanta, Georgia, USA},
  month     = {17--19 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v28/goerg13.pdf},
  url       = {http://proceedings.mlr.press/v28/goerg13.html}
}
APA
Goerg, G. (2013). Forecastable Component Analysis. Proceedings of the 30th International Conference on Machine Learning, PMLR 28(2):64-72.

Related Material

Download PDF: http://proceedings.mlr.press/v28/goerg13.pdf