SEED Dataset
A collection of EEG-signal datasets for various research purposes
Experiment Setup
The emotion induction method adopted in this experiment is stimulus-material induction: subjects watch specific emotional stimulus materials so that the corresponding emotional states are elicited.
Subjects
Data were collected from 20 subjects, including 10 males and 10 females. All subjects were recruited through social platforms and were students of Shanghai Jiao Tong University with normal hearing and vision, right-handed, and in a stable mental state. After expressing their intention to participate through social platforms, subjects took the Eysenck Personality Questionnaire (EPQ).
Feature Extraction
EEG Features
For EEG signal processing, the raw EEG data are first downsampled to a 200 Hz sampling rate. To filter noise and remove artifacts, the EEG data are then processed with a bandpass filter between 1 Hz and 75 Hz. Afterward, we extract differential entropy (DE) features within each segment in five frequency bands: 1) delta: 1~4 Hz; 2) theta: 4~8 Hz; 3) alpha: 8~14 Hz; 4) beta: 14~31 Hz; and 5) gamma: 31~50 Hz.
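As a rough illustration of this pipeline, the sketch below computes DE features under the usual Gaussian assumption, where DE reduces to 0.5·ln(2πeσ²) per channel, band, and window. The window length, filter order, and function names are illustrative assumptions, not the exact settings used to produce the released features.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Frequency bands from the description above (Hz).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def de_features(eeg, fs=200, win_sec=4):
    """eeg: (n_channels, n_samples) array, already downsampled to fs Hz."""
    win = int(win_sec * fs)            # window length in samples (assumed)
    n_win = eeg.shape[1] // win
    feats = {}
    for band, (lo, hi) in BANDS.items():
        # 4th-order Butterworth bandpass (order is an illustrative choice).
        b, a = butter(4, [lo, hi], btype="band", fs=fs)
        filtered = filtfilt(b, a, eeg, axis=1)
        # Split into non-overlapping windows and estimate per-window variance.
        segs = filtered[:, :n_win * win].reshape(eeg.shape[0], n_win, win)
        var = segs.var(axis=2)
        # Differential entropy of a Gaussian: 0.5 * ln(2 * pi * e * sigma^2).
        feats[band] = 0.5 * np.log(2 * np.pi * np.e * var)
    return feats  # band -> (n_channels, n_windows) DE values
```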
Eye Movement Features
For the eye movement information collected with the SMI eye-tracking glasses, we extracted various features from the detailed parameters used in the literature, such as pupil diameter, fixation, saccade, and blink. A detailed list of eye movement features is shown below.
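As a hedged illustration only, the snippet below computes simple summary statistics (means and standard deviations) over such parameters from an eye-tracking table; the column names are hypothetical placeholders, not the actual SMI export headers or the exact feature set released with the dataset.

```python
import numpy as np
import pandas as pd

def eye_movement_stats(df: pd.DataFrame) -> dict:
    """Compute mean/std statistics for a few eye movement parameters.

    The column names below are hypothetical; adapt them to the real
    export headers of the eye-tracking files.
    """
    feats = {}
    for col in ("pupil_diameter", "fixation_duration",
                "saccade_duration", "blink_duration"):
        if col in df.columns and df[col].notna().any():
            vals = df[col].dropna().to_numpy()
            feats[f"{col}_mean"] = float(np.mean(vals))
            feats[f"{col}_std"] = float(np.std(vals))
    return feats
```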
Dataset Summary
- Folder EEG_DE_features: This folder contains the DE features of 16 participants and a Jupyter notebook giving example code to load the data. Feature files are named "subjectID_sessionID.npz"; for example, "1_123.npz" contains the DE features of the first subject from all three sessions (see the .npz loading sketch after this list).
- Folder EEG_raw: This folder contains the raw EEG data (.cnt files) collected with the Neuroscan device and a Jupyter notebook giving example code to load the data. Raw files are named in the "subjectID_sessionID_date.cnt" format; for example, "1_1_20180804.cnt" is the raw data of the first subject's first session (see the MNE-based loading sketch after this list). **Notice: session numbers are assigned according to the stimulus materials watched, not in chronological order.**
- Folder Eye_movement_features: This folder contains extracted eye movement features.
- Folder Eye_raw: This folder contains Excel files exported from the eye-tracking device.
- Folder src: This folder contains two subfolders, "DCCA-with-attention-mechanism" and "MultualInformationComparison", which hold the DCCA models and the mutual information calculation code used in the paper: W. Liu, J.-L. Qiu, W.-L. Zheng and B.-L. Lu, "Comparing Recognition Performance and Robustness of Multimodal Deep Learning Models for Multimodal Emotion Recognition," IEEE Transactions on Cognitive and Developmental Systems, doi: 10.1109/TCDS.2021.3071170.
- File trial_start_end_timestamp.txt: This file contains the start and end timestamps of the stimulus movie clips.
- File emotion_label_and_stimuli_order.xlsx: This file contains the emotion labels and stimuli orders.
- File Participants_info.xlsx: This file contains meta-information of the subjects.
- File Scores.xlsx: This file contains feedback scores of the participants.
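A minimal sketch for inspecting one of the released DE feature files is given below. The internal array names of the .npz archives are not spelled out here, so the example simply lists whatever keys are present; the bundled Jupyter notebook remains the authoritative loading example.

```python
import numpy as np

# List the arrays stored in one DE feature file before indexing into it;
# the path assumes the folder layout described above.
data = np.load("EEG_DE_features/1_123.npz", allow_pickle=True)
print(data.files)  # names of the stored arrays
for key in data.files:
    arr = data[key]
    print(key, getattr(arr, "shape", type(arr)))
```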
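For the raw .cnt recordings, one possible starting point is MNE-Python's Neuroscan reader; the resampling and filtering below mirror the preprocessing described in the EEG Features section, but this is an assumption-laden sketch rather than the exact pipeline behind the released features.

```python
import mne

# Read one Neuroscan .cnt recording (file name follows the convention above).
raw = mne.io.read_raw_cnt("EEG_raw/1_1_20180804.cnt", preload=True)
raw.resample(200)                    # downsample to 200 Hz
raw.filter(l_freq=1.0, h_freq=75.0)  # 1-75 Hz bandpass
print(raw.info)
```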
Download
Download SEED-V
Reference
If you find this dataset helpful for your research, please cite the following reference in your publications.
Wei Liu, Jie-Lin Qiu, Wei-Long Zheng and Bao-Liang Lu, Comparing Recognition Performance and Robustness of Multimodal Deep Learning Models for Multimodal Emotion Recognition, IEEE Transactions on Cognitive and Developmental Systems, 2021. [link] [BibTex]