FG-MSTGNN: Cross-subject EEG Emotion Recognition via Frequency-guided Multi-period Spatial-temporal Graph Neural Network
Proceedings of the 17th Asian Conference on Machine Learning, PMLR 304:1102-1117, 2025.
Abstract
Accurate decoding of emotional EEG signals is a critical challenge in developing affective brain-computer interfaces. Contemporary methods for cross-subject EEG-based emotion recognition face two key limitations: 1) inadequate investigation of the distinct affective features of each EEG rhythm; and 2) insufficient capability to extract the varied neurophysiological connectivity patterns that different subjects exhibit in the same experimental setting. To address these limitations, we propose FG-MSTGNN, a Frequency-guided Multi-period Spatial-temporal Graph Neural Network realized as a dual-stage adaptive learning framework. The Feature Learning Stage uses a Multi-period Time-Frequency Cooperative Encoder Module to hierarchically extract cross-frequency rhythmic dynamics, while the Topology Optimization Stage uses a Dual-Phase Graph Pooling Module to dynamically generate personalized sparse neurophysiological connectivity patterns. Systematic evaluation in cross-subject experiments shows that the framework achieves average classification accuracies of 94.67% on SEED and 85.28% on SEED-IV, a statistically significant improvement over state-of-the-art EEG emotion recognition methods. The framework further reveals that both functional brain-network topology and EEG spectral dynamics vary across emotional states.
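To make the notion of per-rhythm EEG features concrete, the sketch below computes mean spectral power of a single-channel signal in the conventional EEG rhythm bands (delta, theta, alpha, beta, gamma). This is a generic, hedged illustration of the kind of cross-frequency input a time-frequency encoder might consume; the function and band names are our own and are not taken from the paper's implementation.

```python
import numpy as np

# Conventional EEG rhythm ranges in Hz (illustrative, not the paper's exact choice).
BANDS = {
    "delta": (1, 4),
    "theta": (4, 8),
    "alpha": (8, 13),
    "beta": (13, 30),
    "gamma": (30, 50),
}

def band_powers(signal: np.ndarray, fs: float) -> dict:
    """Mean spectral power of a 1-D EEG signal in each rhythm band."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = float(psd[mask].mean()) if mask.any() else 0.0
    return out

# Toy check: a 10 Hz sinusoid concentrates its power in the alpha band.
fs = 200.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10.0 * t)
powers = band_powers(x, fs)
assert max(powers, key=powers.get) == "alpha"
```

In a multi-channel setting, stacking such per-band features over all electrodes yields a node-feature matrix that a spatial-temporal graph neural network can operate on.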