
Cross-subject EEG Emotion Recognition based on Multitask Adversarial Domain Adaption

Abstract

Emotion recognition is crucial for enhancing human-computer interaction. Because emotion manifests very differently across individuals, traditional models generalize poorly to new subjects. Moreover, existing algorithms typically focus on identifying a single emotion dimension, overlooking the intrinsic connections among multiple emotions. We therefore propose a multi-task adversarial domain adaptation (MADA) model for EEG-based emotion recognition. First, domain matching selects the most similar individual in the dataset as the source domain, alleviating individual differences and reducing training time. Next, multi-task learning classifies multiple emotion dimensions simultaneously, capturing their intrinsic connections. Finally, adversarial domain adaptation is applied to bridge the individual differences between the source and target domains. Cross-subject experiments on the DEAP dataset show that our model achieves accuracies of 78.08%, 68.36%, and 69.64% on the valence, arousal, and dominance dimensions, respectively, surpassing state-of-the-art methods and demonstrating its effectiveness in recognizing multi-dimensional emotions.
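The abstract outlines three components: subject-level domain matching, multi-task classification of valence, arousal, and dominance, and adversarial domain adaptation between the matched source subject and the target subject. The sketch below illustrates how the latter two components are commonly combined; it assumes a gradient-reversal-based domain discriminator, an illustrative 160-dimensional EEG feature vector, and hypothetical names such as `MADASketch`, and is not the authors' exact architecture.

```python
# Minimal PyTorch sketch of multi-task adversarial domain adaptation.
# Layer sizes, the 160-dim DEAP feature vector, and the gradient-reversal
# discriminator are illustrative assumptions, not the paper's exact model.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; reverses and scales the gradient on backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lambd * grad_output, None


class MADASketch(nn.Module):
    def __init__(self, in_dim=160, hidden=128, n_classes=2):
        super().__init__()
        # Shared feature extractor used by every emotion task and the discriminator.
        self.features = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # One classification head per emotion dimension (multi-task learning).
        self.heads = nn.ModuleDict({
            name: nn.Linear(hidden, n_classes)
            for name in ("valence", "arousal", "dominance")
        })
        # Domain discriminator: source subject vs. target subject.
        self.domain = nn.Sequential(
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, 2)
        )

    def forward(self, x, lambd=1.0):
        h = self.features(x)
        emotion_logits = {name: head(h) for name, head in self.heads.items()}
        # Gradient reversal trains the extractor to fool the discriminator,
        # pulling source- and target-subject features into a shared space.
        domain_logits = self.domain(GradReverse.apply(h, lambd))
        return emotion_logits, domain_logits


if __name__ == "__main__":
    model = MADASketch()
    src = torch.randn(32, 160)  # labeled trials from the matched source subject
    tgt = torch.randn(32, 160)  # unlabeled trials from the target subject
    emo_src, dom_src = model(src)
    _, dom_tgt = model(tgt)
    ce = nn.CrossEntropyLoss()
    # Multi-task emotion loss on the source domain plus the adversarial
    # domain-classification loss on both domains (random labels here only
    # to make the sketch self-contained and runnable).
    loss = sum(ce(logits, torch.randint(0, 2, (32,))) for logits in emo_src.values())
    loss = loss + ce(dom_src, torch.zeros(32, dtype=torch.long)) \
                + ce(dom_tgt, torch.ones(32, dtype=torch.long))
    loss.backward()
```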
