Welcome to the SJTU Emotion EEG Dataset (SEED)

NEWS: SEED-VIG, a multimodal dataset with EEG and forehead EOG for vigilance estimation, was released in April 2017. It is described in the following paper:

Wei-Long Zheng and Bao-Liang Lu, A Multimodal Approach to Estimating Vigilance Using EEG and Forehead EOG, Journal of Neural Engineering, 14(2): 026017, 2017. [link]

For more details, please refer to the DOWNLOAD page and the PUBLICATION page.

SEED: We developed an EEG dataset for emotion recognition called the SJTU Emotion EEG Dataset (SEED). The EEG signals of 15 subjects were recorded while they were watching emotional film clips. As feedback, participants reported their emotional reactions to each film clip by completing a questionnaire immediately after watching it. To investigate neural signatures and stable patterns across sessions and individuals, each subject performed the experiment in three sessions, with an interval of one week or longer between any two sessions. Facial videos and EEG data were recorded simultaneously. EEG was recorded with an ESI NeuroScan System at a sampling rate of 1000 Hz from a 62-channel active AgCl electrode cap placed according to the international 10-20 system.
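
For readers new to EEG data, the following Python sketch shows one way a recording of this shape might be inspected. It assumes the data are distributed as MATLAB .mat files in which each variable holds a 62-channel-by-samples array; the file name and variable layout here are hypothetical placeholders, not the official SEED release format.

    # Minimal sketch (not official SEED tooling) for inspecting one recording.
    # Assumes a .mat file whose non-metadata variables are (62, n_samples) arrays.
    import numpy as np
    from scipy.io import loadmat

    mat = loadmat("subject1_session1.mat")  # hypothetical file name

    for name, value in mat.items():
        if name.startswith("__"):           # skip MATLAB metadata keys
            continue
        eeg = np.asarray(value)             # expected shape: (62, n_samples)
        duration_s = eeg.shape[1] / 1000.0  # raw sampling rate is 1000 Hz
        print(f"{name}: {eeg.shape[0]} channels, {duration_s:.1f} s")
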

The dataset is made publicly available to the research community, and researchers are encouraged to validate their emotion analysis methods on it. The dataset was first fully described in the following papers:

Wei-Long Zheng and Bao-Liang Lu, Investigating Critical Frequency Bands and Channels for EEG-based Emotion Recognition with Deep Neural Networks, IEEE Transactions on Autonomous Mental Development (IEEE TAMD), 7(3): 162-175, 2015. [link] [BibTex]

Wei-Long Zheng, Jia-Yi Zhu, and Bao-Liang Lu, Identifying Stable Patterns over Time for Emotion Recognition from EEG, to appear in IEEE Transactions on Affective Computing, 2017. [link]

As of June 2017, we had received 170 applications from 33 countries and regions around the world to use SEED in research studies.

How to Download

If you are interested in using our dataset in your research, you will need to print, sign, and scan a license agreement and return it via email. We will then send you a username and password to download the data. Please refer to the Download page for more details.

Acknowledgments

This work was supported in part by grants from the National Natural Science Foundation of China (Grant No. 61272248), the National Basic Research Program of China (Grant No. 2013CB329401), the Science and Technology Commission of Shanghai Municipality (Grant No. 13511500200), the Open Funding Project of the National Key Laboratory of Human Factors Engineering (Grant No. HF2012-K-01), and the European Union Seventh Framework Program (Grant No. 247619).