FeelNet: A Lightweight Fast Fourier Transform EEG-based Emotion Recognition Network
Proceedings of the 17th Asian Conference on Machine Learning, PMLR 304:479-494, 2025.
Abstract
Emotion recognition using Electroencephalography (EEG) is challenging due to the signal's low signal-to-noise ratio and high-dimensional sparsity. We propose FeelNet, a novel Fast Fourier Transform (FFT)-based architecture that simultaneously extracts global and local features across the joint frequency-time domain. FeelNet incorporates an adaptive Rhythm Spectral Block (RSB) that captures key frequency patterns and filters task-irrelevant noise through power spectral thresholding. Additionally, the Multi-scale Temporal Conv Block (MTCB) enhances the model's ability to decode complex temporal dynamics. Extensive evaluations on the DEAP and DREAMER datasets demonstrate that FeelNet outperforms existing state-of-the-art methods in accuracy and flexibility, even under noise-contaminated conditions. Owing to its computational efficiency and noise resilience, FeelNet offers an alternative perspective for EEG-based affective computing.
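To illustrate the general idea behind the frequency-domain processing the abstract describes, here is a minimal NumPy sketch of FFT-based power spectral thresholding on a single EEG channel. This is a hypothetical stand-in, not FeelNet's actual RSB: the quantile threshold and the standard EEG rhythm band boundaries are assumptions for illustration only.

```python
import numpy as np

def band_powers_with_threshold(segment, fs=128.0, threshold_quantile=0.25):
    """Sketch of power-spectral thresholding on one EEG channel.

    Computes the FFT power spectrum, zeroes out low-power bins
    (a simple stand-in for filtering task-irrelevant noise), and
    sums power within the standard EEG rhythm bands.
    NOTE: the quantile rule and band edges are illustrative
    assumptions, not the paper's specification.
    """
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    power = np.abs(np.fft.rfft(segment)) ** 2
    # Suppress bins whose power falls below a quantile threshold.
    power = np.where(power >= np.quantile(power, threshold_quantile),
                     power, 0.0)
    # Conventional EEG rhythm bands in Hz (delta through gamma).
    bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}
    return {name: power[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in bands.items()}

# Example: a 2-second segment dominated by a 10 Hz (alpha) oscillation.
fs = 128.0
t = np.arange(int(2 * fs)) / fs
rng = np.random.default_rng(0)
segment = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
powers = band_powers_with_threshold(segment, fs)
print(max(powers, key=powers.get))  # alpha band dominates
```

In practice such band powers would feed a learned network rather than a fixed rule; the sketch only shows why FFT features make rhythm-specific energy (and noise floor) easy to separate.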