Spectral Differential Network Analysis for High-Dimensional Time Series

Michael Hellstern, Byol Kim, Zaid Harchaoui, Ali Shojaie
Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, PMLR 258:2512-2520, 2025.

Abstract

Spectral networks derived from multivariate time series data arise in many domains, from brain science to Earth science. Often, it is of interest to study how these networks change under different conditions. For instance, to better understand epilepsy, it would be interesting to capture the changes in the brain connectivity network as a patient experiences a seizure, using electroencephalography data. A common approach relies on estimating the networks in each condition and calculating their difference. Such estimates may behave poorly in high dimensions as the networks themselves may not be sparse in structure while their difference may be. We build upon this observation to develop an estimator of the difference in inverse spectral densities across two conditions. Using an l1 penalty on the difference, consistency is established by only requiring the difference to be sparse. We illustrate the method on synthetic data experiments and on experiments with electroencephalography data.
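To fix ideas, here is a minimal sketch of the quantity being estimated, in assumed notation (the paper's exact loss and estimator may differ). For stationary p-dimensional series X_t and Y_t observed under the two conditions, with spectral density matrices S_X(ω) and S_Y(ω), the spectral networks are encoded by the inverse spectral densities Θ_X(ω) = S_X(ω)^{-1} and Θ_Y(ω) = S_Y(ω)^{-1}, and the target is their difference:

% Sketch only: notation and loss assumed for illustration, not taken from the paper.
\[
  \Delta(\omega) \;=\; \Theta_X(\omega) - \Theta_Y(\omega),
  \qquad
  S_X(\omega)\,\Delta(\omega)\,S_Y(\omega) \;=\; S_Y(\omega) - S_X(\omega).
\]
% The identity on the right lets one target \Delta(\omega) directly from smoothed
% spectral density estimates \hat S_X(\omega), \hat S_Y(\omega), without inverting
% either one, e.g. via an \ell_1-penalized D-trace-type convex program over
% Hermitian \Delta (one natural choice; not necessarily the paper's objective):
\[
  \widehat{\Delta}(\omega) \;\in\; \arg\min_{\Delta = \Delta^{*}}\;
  \tfrac{1}{2}\,\mathrm{Re}\,\mathrm{tr}\!\bigl(\Delta\,\hat S_X(\omega)\,\Delta\,\hat S_Y(\omega)\bigr)
  \;-\; \mathrm{Re}\,\mathrm{tr}\!\bigl(\Delta\,(\hat S_Y(\omega) - \hat S_X(\omega))\bigr)
  \;+\; \lambda \lVert \Delta \rVert_{1},
\]
% where \lVert \Delta \rVert_{1} is the elementwise \ell_1 norm. The population
% version of this loss is minimized at the true \Delta(\omega), so consistency of
% a direct estimator of this kind hinges on sparsity of \Delta(\omega) alone,
% not of \Theta_X(\omega) or \Theta_Y(\omega) individually.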

Cite this Paper


BibTeX
@InProceedings{pmlr-v258-hellstern25a,
  title     = {Spectral Differential Network Analysis for High-Dimensional Time Series},
  author    = {Hellstern, Michael and Kim, Byol and Harchaoui, Zaid and Shojaie, Ali},
  booktitle = {Proceedings of The 28th International Conference on Artificial Intelligence and Statistics},
  pages     = {2512--2520},
  year      = {2025},
  editor    = {Li, Yingzhen and Mandt, Stephan and Agrawal, Shipra and Khan, Emtiyaz},
  volume    = {258},
  series    = {Proceedings of Machine Learning Research},
  month     = {03--05 May},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v258/main/assets/hellstern25a/hellstern25a.pdf},
  url       = {https://proceedings.mlr.press/v258/hellstern25a.html},
  abstract  = {Spectral networks derived from multivariate time series data arise in many domains, from brain science to Earth science. Often, it is of interest to study how these networks change under different conditions. For instance, to better understand epilepsy, it would be interesting to capture the changes in the brain connectivity network as a patient experiences a seizure, using electroencephalography data. A common approach relies on estimating the networks in each condition and calculating their difference. Such estimates may behave poorly in high dimensions as the networks themselves may not be sparse in structure while their difference may be. We build upon this observation to develop an estimator of the difference in inverse spectral densities across two conditions. Using an l1 penalty on the difference, consistency is established by only requiring the difference to be sparse. We illustrate the method on synthetic data experiments and on experiments with electroencephalography data.}
}
Endnote
%0 Conference Paper
%T Spectral Differential Network Analysis for High-Dimensional Time Series
%A Michael Hellstern
%A Byol Kim
%A Zaid Harchaoui
%A Ali Shojaie
%B Proceedings of The 28th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2025
%E Yingzhen Li
%E Stephan Mandt
%E Shipra Agrawal
%E Emtiyaz Khan
%F pmlr-v258-hellstern25a
%I PMLR
%P 2512--2520
%U https://proceedings.mlr.press/v258/hellstern25a.html
%V 258
%X Spectral networks derived from multivariate time series data arise in many domains, from brain science to Earth science. Often, it is of interest to study how these networks change under different conditions. For instance, to better understand epilepsy, it would be interesting to capture the changes in the brain connectivity network as a patient experiences a seizure, using electroencephalography data. A common approach relies on estimating the networks in each condition and calculating their difference. Such estimates may behave poorly in high dimensions as the networks themselves may not be sparse in structure while their difference may be. We build upon this observation to develop an estimator of the difference in inverse spectral densities across two conditions. Using an l1 penalty on the difference, consistency is established by only requiring the difference to be sparse. We illustrate the method on synthetic data experiments and on experiments with electroencephalography data.
APA
Hellstern, M., Kim, B., Harchaoui, Z. & Shojaie, A. (2025). Spectral Differential Network Analysis for High-Dimensional Time Series. Proceedings of The 28th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 258:2512-2520. Available from https://proceedings.mlr.press/v258/hellstern25a.html.
