1. HYGRIP: Full-Stack Characterization of Neurobehavioral Signals (fNIRS, EEG, EMG, Force, and Breathing) During a Bimanual Grip Force Control Task
- Author
Pablo Ortega, Tong Zhao, A. Aldo Faisal, and I. Daly; funded by the Engineering and Physical Sciences Research Council
- Subjects
near-infrared spectroscopy (fNIRS) ,electroencephalography (EEG) ,brain-computer interface ,non-invasive ,continuous decoding ,sensor-fusion ,power-grip ,grip force ,breathing ,data set ,neuroscience ,Data Report - Abstract
Brain-computer interfaces (BCIs) have achieved important milestones in recent years, but most breakthroughs in the continuous control of movement have focused on invasive neural interfaces with the motor cortex or peripheral nerves. In contrast, non-invasive BCIs have primarily made progress in continuous decoding using event-related data, while the direct decoding of movement commands or muscle force from brain data remains an open challenge. Multi-modal signals from the human cortex, obtained from mobile brain imaging that combines oxygenation and electrical neuronal signals, have not yet realized their full potential due to the lack of computational techniques able to fuse and decode these hybrid measurements. To bring the research community and machine learning techniques closer to the state of the art in artificial intelligence, we release herewith a holistic data set of hybrid non-invasive measures for continuous force decoding: the Hybrid Dynamic Grip (HYGRIP) data set. We aim to provide a complete data set that comprises the target force for the left/right hand, cortical brain signals in the form of electroencephalography (EEG) with high temporal resolution and functional near-infrared spectroscopy (fNIRS), which captures a BOLD-like cortical brain response at higher spatial resolution, the muscle activity (EMG) of the grip muscles, the force generated at the grip sensor (force), and confounding noise sources, such as breathing and eye movement activity during the task. In total, 14 right-handed subjects performed a uni-manual dynamic grip force task within 25-50% of each hand's maximum voluntary contraction. HYGRIP is intended as a benchmark with two open challenges and research questions for grip-force decoding. First, the exploitation and fusion of data from brain signals spanning very different time-scales, as EEG changes about three orders of magnitude faster than fNIRS.
Second, the decoding of whole-brain signals associated with the use of each hand, and the extent to which decoding models share features across hands or, conversely, are hand-specific. Our companion code makes the exploitation of the data readily available and accessible to researchers in the BCI, neurophysiology, and machine learning communities. Thus, HYGRIP can serve as a test-bed for the development of BCI decoding algorithms that fuse multimodal brain signals. The resulting methods will help identify the limitations and opportunities of such signals to benefit healthy users and, indirectly, inform similar methods addressing the particular needs of people with disease.
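The first challenge above, fusing signals whose sampling rates differ by roughly three orders of magnitude, can be illustrated with a minimal sketch. This is not the companion code of the data set; the sampling rates (1 kHz for EEG, 10 Hz for fNIRS) and the windowed-power feature are assumptions chosen only to show one common alignment strategy: summarizing the fast signal within a window centred on each slow-signal sample.

```python
import numpy as np

# Hypothetical sampling rates; the actual HYGRIP rates are documented
# with the data set and its companion code.
fs_eeg, fs_fnirs = 1000.0, 10.0

t_eeg = np.arange(0.0, 5.0, 1.0 / fs_eeg)      # 5 s of EEG timestamps
t_fnirs = np.arange(0.0, 5.0, 1.0 / fs_fnirs)  # 5 s of fNIRS timestamps

# Placeholder single-channel EEG signal (a 1 Hz sinusoid).
eeg = np.sin(2.0 * np.pi * 1.0 * t_eeg)

# Summarize EEG power in a window centred on each fNIRS sample, so that
# both modalities yield one feature per (slower) fNIRS time step.
half_win = int(fs_eeg / fs_fnirs) // 2
centres = (t_fnirs * fs_eeg).astype(int)
eeg_power = np.array([
    np.mean(eeg[max(0, i - half_win):i + half_win] ** 2)
    for i in centres
])

# eeg_power is now time-aligned with the fNIRS samples and could be
# concatenated with fNIRS features for a joint decoder.
assert eeg_power.shape == t_fnirs.shape
```

Window averaging is only one option; interpolating the slow signal up to the fast timeline, or learning the fusion end-to-end, are equally valid starting points for the benchmark.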
- Published
- 2020