1. A brain-computer interface that evokes tactile sensations improves robotic arm control
- Author
Jeffrey M. Weiss, Angelica J. Herrera, Elizabeth C. Tyler-Kabara, Sharlene N. Flesher, Robert A. Gaunt, Michael L. Boninger, Christopher L. Hughes, John E. Downey, and Jennifer L. Collinger
- Subjects
Adult, Male, Humans, Computer science, Movement, Interface (computing), Artificial Limbs, Quadriplegia, Tetraplegia, Somatosensory system, Somatosensory Cortex, Physical medicine and rehabilitation, Brain–computer interface, Brain-Computer Interfaces, Hand Strength, Grasp, Motor Cortex, Robotics, Robotic arm, Touch, Arm, Artificial intelligence, Multidisciplinary
- Abstract
A boost for brain–computer interfaces: The finely controlled movement of our limbs requires two-way neuronal communication between the brain and the body periphery. This includes afferent information from muscles, joints, and skin, as well as visual feedback to plan, initiate, and execute motor output. In tetraplegia, this neural communication is interrupted in both directions at the level of the spinal cord. Brain–computer interfaces have been developed to produce voluntary motor output by decoding directly recorded brain activity. Flesher et al. added an afferent channel to the brain–computer interface to mimic sensory input from the skin of a hand (see the Perspective by Faisal). Adding this afferent input yielded substantial improvements across a battery of motor tasks tested in a human subject. Science, abd0380, this issue p. 831; see also abi7262, p. 791
- Published
- 2021