Multivariate Time Series Forecasting By Graph Attention Networks With Theoretical Guarantees
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:2845-2853, 2024.
Abstract
Multivariate time series forecasting (MTSF) aims to predict future values of multiple variables based on past values of a multivariate time series, and has been applied in fields including traffic flow prediction, stock price forecasting, and anomaly detection. Capturing the inter-dependencies among multiple series poses a significant challenge in MTSF. Recent works model the correlated series as graph nodes and use graph neural network (GNN)-based approaches with attention mechanisms to improve test prediction accuracy; however, none of these methods provide theoretical guarantees on the generalization error. In this paper, we develop a new norm-bounded graph attention network (GAT) for MTSF, which upper-bounds the Frobenius norm of the weights in each layer of the GAT model to enhance performance. We theoretically establish a generalization error bound for our model that depends on several components of the GAT model: the number of attention heads, the maximum number of neighbors, the upper bound on the Frobenius norm of the weight matrix in each layer, and the norm of the input features. Empirically, we investigate the impact of these components on the generalization performance of MTSF on real data; the experiments verify our theoretical findings. We also compare our method against several widely cited graph-based methods for MTSF on real data sets, and the results show that our method achieves the best performance. Our method offers a novel perspective on improving the generalization performance of MTSF, and our theoretical guarantees have substantial implications for designing graph-based methods with attention mechanisms for MTSF.
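The central mechanism described above, constraining the Frobenius norm of each layer's weight matrix in a GAT, can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's implementation: the class name NormBoundedGATLayer, the single-head attention formulation, the LeakyReLU slope, and the projection-after-update scheme are all hypothetical choices made for clarity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NormBoundedGATLayer(nn.Module):
    """Single-head graph attention layer whose weight matrix is kept inside
    a Frobenius-norm ball of fixed radius (illustrative sketch only)."""

    def __init__(self, in_dim: int, out_dim: int, max_frob_norm: float = 1.0):
        super().__init__()
        self.W = nn.Parameter(torch.empty(in_dim, out_dim))
        self.a = nn.Parameter(torch.empty(2 * out_dim))
        nn.init.xavier_uniform_(self.W)
        nn.init.uniform_(self.a, -0.1, 0.1)
        self.out_dim = out_dim
        self.max_frob_norm = max_frob_norm

    @torch.no_grad()
    def project_weights(self):
        # Rescale W back onto the Frobenius-norm ball after a gradient step,
        # enforcing ||W||_F <= max_frob_norm.
        norm = self.W.norm(p="fro")
        if norm > self.max_frob_norm:
            self.W.mul_(self.max_frob_norm / norm)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency with self-loops.
        h = x @ self.W                                     # (N, out_dim)
        src = h @ self.a[: self.out_dim]                   # (N,)  a_1^T W h_i
        dst = h @ self.a[self.out_dim:]                    # (N,)  a_2^T W h_j
        e = F.leaky_relu(src.unsqueeze(1) + dst.unsqueeze(0), 0.2)
        e = e.masked_fill(adj == 0, float("-inf"))         # attend only to neighbors
        alpha = torch.softmax(e, dim=1)                    # attention coefficients
        return alpha @ h                                   # aggregated node embeddings


if __name__ == "__main__":
    layer = NormBoundedGATLayer(in_dim=8, out_dim=4, max_frob_norm=1.0)
    x = torch.randn(5, 8)                                  # 5 series as graph nodes
    adj = torch.eye(5) + torch.diag(torch.ones(4), 1)      # chain graph + self-loops
    adj = ((adj + adj.T) > 0).float()
    out = layer(x, adj)
    layer.project_weights()                                # enforce the norm bound
    print(out.shape, layer.W.norm(p="fro").item())
```

In this sketch the norm bound is enforced by projection after each optimizer step (call project_weights() in the training loop); the paper's analysis ties such per-layer bounds, together with the number of attention heads, the maximum neighborhood size, and the input feature norm, to the generalization error.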