Distillation-Based Domain Generalization for Cross-Dataset EEG-Based Emotion Recognition
Electroencephalogram (EEG)-based emotion recognition has gradually become a research hotspot with extensive real-world applications. Differences in EEG signals across subjects usually lead to unsatisfactory performance in subject-independent emotion recognition. To handle this challenge, many researchers have turned to transfer learning techniques, which have yielded promising results. Currently, most researchers focus on transfer learning between subjects within a single dataset. Cross-dataset transfer learning, however, presents a greater challenge, because data collected from different environments and equipment exhibit much more severe variations. Domain Generalization (DG) has great potential to handle unseen data without involving them in training. In addition, Knowledge Distillation (KD), which transfers the knowledge learned by a teacher model to a student model, has shown promise in improving the student model's generalization. Inspired by DG and KD, we propose a novel and effective method, Distillation-Based Domain Generalization (DBDG), for cross-dataset EEG-based emotion recognition. Specifically, DBDG contains three modules: feature extraction, online distillation, and self-distillation. The feature extraction module learns discriminative emotional features from EEG signals, while the online distillation and self-distillation modules enhance the method's generalizability. Experimental results on three public benchmark datasets, SEED, SEED-IV, and DEAP, demonstrate the effectiveness of our method for cross-dataset EEG-based emotion recognition.
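For readers unfamiliar with knowledge distillation, the sketch below shows the standard soft-target KD loss (Hinton et al., 2015) in PyTorch, which commonly underlies both online distillation and self-distillation schemes. This is a minimal illustration under generic assumptions, not the paper's DBDG objective; the function name, temperature T, and mixing weight alpha are all illustrative choices rather than values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 4.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Generic KD loss: a weighted sum of the soft-target KL term
    and the hard-label cross-entropy term. T and alpha are
    illustrative hyperparameters, not taken from the paper."""
    # Soften both distributions with temperature T; the T^2 factor
    # keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In an online-distillation setting, teacher and student are trained jointly and the teacher's logits come from a peer network on the same batch; in self-distillation, the "teacher" is an earlier snapshot or a deeper branch of the same network.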