M3CV: A multi-subject, multi-session, and multi-task database for EEG-based biometrics challenge
- Author
Gan Huang, Zhenxing Hu, Weize Chen, Shaorong Zhang, Zhen Liang, Linling Li, Li Zhang, and Zhiguo Zhang
- Subjects
Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571
- Abstract
EEG signals exhibit commonality and variability across subjects, sessions, and tasks. However, most existing EEG studies focus on mean group effects (commonality) by averaging signals over trials and subjects, and the substantial intra- and inter-subject variability of EEG has often been overlooked. Recent technological advances in machine learning, especially deep learning, have brought innovations to many aspects of EEG signal applications, but great challenges remain in cross-session, cross-task, and cross-subject EEG decoding. In this work, an EEG-based biometric competition based on the large-scale M3CV database (A Multi-subject, Multi-session, and Multi-task Database for investigation of EEG Commonality and Variability) was launched to better characterize and harness the intra- and inter-subject variability and to promote the development of machine learning algorithms in this field. In the M3CV database, EEG signals were recorded from 106 subjects, of whom 95 repeated two sessions of the experiments on different days. The whole experiment consisted of 6 paradigms, including resting-state, transient-state sensory, steady-state sensory, cognitive oddball, motor execution, and steady-state sensory with selective attention, yielding 14 types of EEG signals and 120,000 epochs. Two learning tasks (identification and verification), performance metrics, and baseline methods were introduced in the competition. Overall, the proposed M3CV dataset and the EEG-based biometric competition aim to provide the opportunity to develop advanced machine learning algorithms for achieving an in-depth understanding of the commonality and variability of EEG signals across subjects, sessions, and tasks.
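The abstract distinguishes two learning tasks: identification (predicting which enrolled subject produced an EEG epoch) and verification (deciding whether an epoch matches a claimed identity). The following is a minimal sketch of that setup, not the competition's baseline methods: the synthetic epoch features, the LDA classifier for identification, the cosine-similarity templates for verification, and the accuracy/AUC metrics are all illustrative assumptions, with session 1 standing in for enrollment and session 2 for testing.

```python
# Illustrative sketch of subject identification vs. verification on
# synthetic stand-ins for per-epoch EEG feature vectors (hypothetical data,
# not the M3CV baseline pipeline).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
n_subjects, n_epochs, n_features = 20, 30, 64
subject_means = rng.normal(size=(n_subjects, n_features))

def make_session(noise=1.0):
    # One feature vector per epoch, with the subject label attached.
    X = np.vstack([m + noise * rng.normal(size=(n_epochs, n_features))
                   for m in subject_means])
    y = np.repeat(np.arange(n_subjects), n_epochs)
    return X, y

X_enroll, y_enroll = make_session()   # "session 1": enrollment
X_test, y_test = make_session()       # "session 2": testing

# Identification: multi-class prediction of subject identity.
clf = LinearDiscriminantAnalysis().fit(X_enroll, y_enroll)
print("identification accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Verification: score each (test epoch, claimed subject) pair by cosine
# similarity to the subject's enrollment template; genuine pairs share a
# subject, impostor pairs do not.
templates = np.vstack([X_enroll[y_enroll == s].mean(0) for s in range(n_subjects)])
templates /= np.linalg.norm(templates, axis=1, keepdims=True)
X_norm = X_test / np.linalg.norm(X_test, axis=1, keepdims=True)
scores = X_norm @ templates.T                                    # (epochs, subjects)
labels = (y_test[:, None] == np.arange(n_subjects)[None, :]).astype(int)
print("verification AUC:", roc_auc_score(labels.ravel(), scores.ravel()))
```

In a real biometric evaluation the verification scores would typically be summarized with an equal error rate rather than AUC, but the same genuine/impostor pair construction applies.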
- Published
- 2022