The SJTU Emotion EEG Dataset (SEED) is a collection of EEG datasets provided by the BCMI laboratory, which is led by Prof. Bao-Liang Lu. The name is inherited from the first version of the dataset, but we now provide not only emotion datasets but also vigilance datasets. If you are interested in the datasets, take a look at the download page.
The SEED dataset contains subjects' EEG signals recorded while they were watching film clips. The film clips were carefully selected to induce three types of emotion: positive, negative, and neutral. Click here for details about the dataset.
SEED-IV is an evolution of the original SEED dataset. The number of emotion categories is increased to four: happy, sad, fear, and neutral. In SEED-IV, we provide not only EEG signals but also eye movement features recorded by SMI eye-tracking glasses, which makes it a well-formed multimodal dataset for emotion recognition. Click here for details about the dataset.
The SEED-VIG dataset is oriented toward the vigilance estimation problem. We built a virtual driving system in which a large screen is placed in front of a real car. Subjects can play a driving game in the car, just as if driving in a real-world environment. The SEED-VIG dataset was collected while the subjects drove in this system. The vigilance level is labelled with the PERCLOS indicator, measured by SMI eye-tracking glasses. Click here for details about the dataset.
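As background for the labelling scheme above: PERCLOS is commonly defined as the proportion of time within a window during which the eyes are closed beyond some threshold (often 80% eyelid closure). The sketch below illustrates that general idea in Python; the threshold, the input representation (per-sample eyelid-closure fractions), and the function name are illustrative assumptions, not the exact SEED-VIG pipeline.

```python
def perclos(closure_samples, threshold=0.8):
    """Illustrative PERCLOS: fraction of samples in which the eyelid-closure
    fraction (0.0 = fully open, 1.0 = fully closed) meets or exceeds
    `threshold`. Parameters are assumptions, not SEED-VIG's exact setup."""
    if not closure_samples:
        return 0.0
    closed = sum(1 for c in closure_samples if c >= threshold)
    return closed / len(closure_samples)

# Example: eyes mostly open, with two brief closures
samples = [0.1, 0.2, 0.9, 1.0, 0.3, 0.1, 0.95, 0.2]
print(perclos(samples))  # 3 of 8 samples exceed the 0.8 threshold -> 0.375
```

In practice such a score would be computed over a sliding window of the eye-tracking stream, yielding a continuous vigilance label aligned with the EEG recording.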
This work was supported in part by grants from the National Key Research and Development Program of China (Grant No. 2017YFB1002501), the National Natural Science Foundation of China (Grant No. 61272248 and No. 61673266), the National Basic Research Program of China (Grant No. 2013CB329401), the Science and Technology Commission of Shanghai Municipality (Grant No. 13511500200), the Open Funding Project of the National Key Laboratory of Human Factors Engineering (Grant No. HF2012-K-01), the Fundamental Research Funds for the Central Universities, and the European Union Seventh Framework Programme (Grant No. 247619).