Welcome to the SJTU Emotion EEG Dataset (SEED)

NEWS: A multimodal dataset of EEG and eye movements for four emotions (happy, neutral, sad, and fear), called SEED-IV, was released in August 2018. Download access can be obtained by sending a request to the administrator. For details, please refer to the following paper:

Wei-Long Zheng, Wei Liu, Yifei Lu, Bao-Liang Lu, and Andrzej Cichocki, EmotionMeter: A Multimodal Framework for Recognizing Human Emotions. IEEE Transactions on Cybernetics, 2018.

NEWS: A multimodal dataset with EEG and forehead EOG for vigilance estimation (SEED-VIG) was released in April 2017. It is described in the following paper:

Wei-Long Zheng and Bao-Liang Lu, A multimodal approach to estimating vigilance using EEG and forehead EOG. Journal of Neural Engineering, 14(2): 026017, 2017. [link]

For more details, please refer to the DOWNLOAD page and the PUBLICATION page.

SEED: We developed an EEG dataset for emotion recognition called the SJTU Emotion EEG Dataset (SEED). The EEG signals of 15 subjects were recorded while they watched emotional film clips. As feedback, participants reported their emotional reactions to each film clip by completing a questionnaire immediately after watching it. To investigate neural signatures and stable patterns across sessions and individuals, each subject performed the experiment in three sessions, with an interval of one week or longer between sessions. Facial videos and EEG data were recorded simultaneously. EEG was recorded with an ESI NeuroScan System at a sampling rate of 1000 Hz from a 62-channel active AgCl electrode cap placed according to the international 10-20 system.
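As a rough illustration of the recording format described above, one clip's EEG can be thought of as a channels-by-samples array: 62 channels sampled at 1000 Hz. The sketch below builds a synthetic segment with those dimensions; the clip duration and array layout are illustrative assumptions, not part of the official release format.

```python
import numpy as np

SAMPLING_RATE_HZ = 1000  # ESI NeuroScan sampling rate stated above
N_CHANNELS = 62          # 62-channel cap, international 10-20 system
CLIP_SECONDS = 240       # hypothetical clip length, for illustration only

# One clip's recording as a (channels, samples) array of synthetic noise.
rng = np.random.default_rng(0)
clip = rng.standard_normal((N_CHANNELS, CLIP_SECONDS * SAMPLING_RATE_HZ))

print(clip.shape)  # → (62, 240000)
```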

The dataset is made publicly available to the research community, and researchers are encouraged to validate their emotion analysis methods on it. The dataset is fully described in the following papers:

Wei-Long Zheng and Bao-Liang Lu, Investigating Critical Frequency Bands and Channels for EEG-based Emotion Recognition with Deep Neural Networks. IEEE Transactions on Autonomous Mental Development (IEEE TAMD), 7(3): 162-175, 2015. [link] [BibTex]

Wei-Long Zheng, Jia-Yi Zhu, and Bao-Liang Lu, Identifying Stable Patterns over Time for Emotion Recognition from EEG, to appear in IEEE Transactions on Affective Computing, 2017. [link]

Yimin Yang, Q. M. Jonathan Wu, Wei-Long Zheng, and Bao-Liang Lu, EEG-based Emotion Recognition Using Hierarchical Network with Subnetwork Nodes. IEEE Transactions on Cognitive and Developmental Systems, 2017. [link]
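The first paper above analyzes which EEG frequency bands are most informative for emotion recognition. A feature commonly used in this line of work is differential entropy, which for a Gaussian signal reduces to ½·ln(2πe·σ²). The sketch below band-passes a signal into conventional EEG bands and computes that quantity per band; the band names, band edges, and filter choice are standard conventions assumed here, not quoted from the papers.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # sampling rate stated in the dataset description
# Conventional EEG band edges in Hz (an assumption, not from the papers).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def differential_entropy(x):
    """DE under a Gaussian assumption: 0.5 * ln(2 * pi * e * var(x))."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def band_de_features(signal, fs=FS):
    """Band-pass each band with a Butterworth filter, then compute DE."""
    feats = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        feats[name] = differential_entropy(filtfilt(b, a, signal))
    return feats

# Example on a synthetic 10-second single-channel signal.
rng = np.random.default_rng(0)
feats = band_de_features(rng.standard_normal(10 * FS))
print(sorted(feats))
```

In practice such per-band features would be computed for every channel and clip, yielding a feature vector per trial for a downstream classifier.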

NEWS: As of July 2018, more than 320 applications from all over the world have requested to use SEED in their studies.

How to Download

If you are interested in using our dataset in your research, you will need to print, sign, and scan the license agreement and return it via email. We will then send you a username and password to download the data. Please refer to the Download page for more details.


This work was supported in part by grants from the National Natural Science Foundation of China (Grant No. 61272248), the National Basic Research Program of China (Grant No. 2013CB329401), the Science and Technology Commission of Shanghai Municipality (Grant No. 13511500200), the Open Funding Project of the National Key Laboratory of Human Factors Engineering (Grant No. HF2012-K-01), and the European Union Seventh Framework Program (Grant No. 247619).