FG-MSTGNN: Cross-subject EEG Emotion Recognition via Frequency-guided Multi-period Spatial-temporal Graph Neural Network

Chenchen Zhang, Yanrong Hao, Xin Wen, Mengni Zhou, Fei Yuan, Rui Cao
Proceedings of the 17th Asian Conference on Machine Learning, PMLR 304:1102-1117, 2025.

Abstract

Accurate decoding of emotional EEG signals constitutes a critical challenge for developing affective brain-computer interfaces. Contemporary methods for cross-subject EEG-based emotion recognition face two key challenges: 1) inadequate investigation of the distinct affective features of EEG rhythms; 2) insufficient capability to extract the varied neurophysiological connectivity patterns across subjects in the same experimental setting. To address these limitations, we propose FG-MSTGNN, a dual-stage adaptive learning framework comprising the Frequency-guided Multi-period Spatial-temporal Graph Neural Network. The Feature Learning Stage employs a Multi-period Time-Frequency Cooperative Encoder Module to hierarchically extract cross-frequency rhythmic dynamics. The Topology Optimization Stage employs a Dual-Phase Graph Pooling Module to dynamically generate personalized sparse neurophysiological connectivity patterns. Systematic evaluation under cross-subject experiments demonstrates that the framework achieves average classification accuracies of 94.67% and 85.28% on SEED and SEED-IV, respectively, showing statistically significant improvements over state-of-the-art EEG emotion recognition methods. The proposed framework reveals that both functional brain network topology and EEG spectral dynamics vary across different emotional states.
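To make the pooling idea concrete: a minimal, generic sketch of top-k graph pooling over EEG channels, where learned node scores keep only the most informative channels and induce a sparse subgraph. This is an illustrative assumption, not the authors' Dual-Phase Graph Pooling Module; the function and variable names are hypothetical.

```python
import numpy as np

def topk_graph_pool(X, A, scores, k):
    """Keep the k highest-scoring nodes of a channel graph.

    X: (n, d) node features; A: (n, n) adjacency; scores: (n,) node scores.
    Returns pooled features, the induced subgraph adjacency, and kept indices.
    """
    idx = np.argsort(scores)[::-1][:k]   # indices of the top-k nodes
    X_pool = X[idx] * scores[idx, None]  # gate kept features by their score
    A_pool = A[np.ix_(idx, idx)]         # adjacency restricted to kept nodes
    return X_pool, A_pool, idx

# Toy example: 5 "channels" with 3 features each (random stand-in data).
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))
A = (rng.random((5, 5)) > 0.5).astype(float)
scores = np.array([0.9, 0.1, 0.5, 0.7, 0.2])
Xp, Ap, idx = topk_graph_pool(X, A, scores, k=3)
print(idx)  # -> [0 3 2]: the three highest-scoring channels
```

In a learned setting the scores would come from a trainable projection of the node features, which is what lets the kept subgraph differ per subject.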

Cite this Paper


BibTeX
@InProceedings{pmlr-v304-zhang25c,
  title     = {FG-MSTGNN: Cross-subject EEG Emotion Recognition via Frequency-guided Multi-period Spatial-temporal Graph Neural Network},
  author    = {Zhang, Chenchen and Hao, Yanrong and Wen, Xin and Zhou, Mengni and Yuan, Fei and Cao, Rui},
  booktitle = {Proceedings of the 17th Asian Conference on Machine Learning},
  pages     = {1102--1117},
  year      = {2025},
  editor    = {Lee, Hung-yi and Liu, Tongliang},
  volume    = {304},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--12 Dec},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v304/main/assets/zhang25c/zhang25c.pdf},
  url       = {https://proceedings.mlr.press/v304/zhang25c.html},
  abstract  = {Accurate decoding of emotional EEG signals constitutes a critical challenge for developing affective brain-computer interfaces. Contemporary methods for cross-subject EEG-based emotion recognition face two key challenges: 1) inadequate investigation of the distinct affective features of EEG rhythms; 2) insufficient capability to extract the varied neurophysiological connectivity patterns across subjects in the same experimental setting. To address these limitations, we propose FG-MSTGNN, a dual-stage adaptive learning framework comprising the Frequency-guided Multi-period Spatial-temporal Graph Neural Network. The Feature Learning Stage employs a Multi-period Time-Frequency Cooperative Encoder Module to hierarchically extract cross-frequency rhythmic dynamics. The Topology Optimization Stage employs a Dual-Phase Graph Pooling Module to dynamically generate personalized sparse neurophysiological connectivity patterns. Systematic evaluation under cross-subject experiments demonstrates that the framework achieves average classification accuracies of 94.67% and 85.28% on SEED and SEED-IV, respectively, showing statistically significant improvements over state-of-the-art EEG emotion recognition methods. The proposed framework reveals that both functional brain network topology and EEG spectral dynamics vary across different emotional states.}
}
Endnote
%0 Conference Paper %T FG-MSTGNN: Cross-subject EEG Emotion Recognition via Frequency-guided Multi-period Spatial-temporal Graph Neural Network %A Chenchen Zhang %A Yanrong Hao %A Xin Wen %A Mengni Zhou %A Fei Yuan %A Rui Cao %B Proceedings of the 17th Asian Conference on Machine Learning %C Proceedings of Machine Learning Research %D 2025 %E Hung-yi Lee %E Tongliang Liu %F pmlr-v304-zhang25c %I PMLR %P 1102--1117 %U https://proceedings.mlr.press/v304/zhang25c.html %V 304 %X Accurate decoding of emotional EEG signals constitutes a critical challenge for developing affective brain-computer interfaces. Contemporary methods for cross-subject EEG-based emotion recognition face two key challenges: 1) inadequate investigation of the distinct affective features of EEG rhythms; 2) insufficient capability to extract the varied neurophysiological connectivity patterns across subjects in the same experimental setting. To address these limitations, we propose FG-MSTGNN, a dual-stage adaptive learning framework comprising the Frequency-guided Multi-period Spatial-temporal Graph Neural Network. The Feature Learning Stage employs a Multi-period Time-Frequency Cooperative Encoder Module to hierarchically extract cross-frequency rhythmic dynamics. The Topology Optimization Stage employs a Dual-Phase Graph Pooling Module to dynamically generate personalized sparse neurophysiological connectivity patterns. Systematic evaluation under cross-subject experiments demonstrates that the framework achieves average classification accuracies of 94.67% and 85.28% on SEED and SEED-IV, respectively, showing statistically significant improvements over state-of-the-art EEG emotion recognition methods. The proposed framework reveals that both functional brain network topology and EEG spectral dynamics vary across different emotional states.
APA
Zhang, C., Hao, Y., Wen, X., Zhou, M., Yuan, F. & Cao, R. (2025). FG-MSTGNN: Cross-subject EEG Emotion Recognition via Frequency-guided Multi-period Spatial-temporal Graph Neural Network. Proceedings of the 17th Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 304:1102-1117. Available from https://proceedings.mlr.press/v304/zhang25c.html.
