61 results for "Matthew S. Fifer"
Search Results
2. A click-based electrocorticographic brain-computer interface enables long-term high-performance switch scan spelling
- Author
-
Daniel N. Candrea, Samyak Shah, Shiyu Luo, Miguel Angrick, Qinwan Rabbani, Christopher Coogan, Griffin W. Milsap, Kevin C. Nathan, Brock A. Wester, William S. Anderson, Kathryn R. Rosenblatt, Alpa Uchil, Lora Clawson, Nicholas J. Maragakis, Mariska J. Vansteensel, Francesco V. Tenore, Nicolas F. Ramsey, Matthew S. Fifer, and Nathan E. Crone
- Subjects
Medicine - Abstract
Abstract Background Brain-computer interfaces (BCIs) can restore communication for movement- and/or speech-impaired individuals by enabling neural control of computer typing applications. Single-command click detectors provide a basic yet highly functional capability. Methods We sought to test the performance and long-term stability of click decoding using a chronically implanted high-density electrocorticographic (ECoG) BCI with coverage of the sensorimotor cortex in a human clinical trial participant (ClinicalTrials.gov, NCT03567213) with amyotrophic lateral sclerosis. We trained the participant’s click detector using a small amount of training data (
- Published
- 2024
- Full Text
- View/download PDF
3. Online speech synthesis using a chronically implanted brain–computer interface in an individual with ALS
- Author
-
Miguel Angrick, Shiyu Luo, Qinwan Rabbani, Daniel N. Candrea, Samyak Shah, Griffin W. Milsap, William S. Anderson, Chad R. Gordon, Kathryn R. Rosenblatt, Lora Clawson, Donna C. Tippett, Nicholas Maragakis, Francesco V. Tenore, Matthew S. Fifer, Hynek Hermansky, Nick F. Ramsey, and Nathan E. Crone
- Subjects
Medicine, Science - Abstract
Abstract Brain–computer interfaces (BCIs) that reconstruct and synthesize speech using brain activity recorded with intracranial electrodes may pave the way toward novel communication interfaces for people who have lost their ability to speak, or who are at high risk of losing this ability, due to neurological disorders. Here, we report online synthesis of intelligible words using a chronically implanted brain-computer interface (BCI) in a man with impaired articulation due to ALS, participating in a clinical trial (ClinicalTrials.gov, NCT03567213) exploring different strategies for BCI communication. The 3-stage approach reported here relies on recurrent neural networks to identify, decode and synthesize speech from electrocorticographic (ECoG) signals acquired across motor, premotor and somatosensory cortices. We demonstrate a reliable BCI that synthesizes commands freely chosen and spoken by the participant from a vocabulary of 6 keywords previously used for decoding commands to control a communication board. Evaluation of the intelligibility of the synthesized speech indicates that 80% of the words can be correctly recognized by human listeners. Our results show that a speech-impaired individual with ALS can use a chronically implanted BCI to reliably produce synthesized words while preserving the participant’s voice profile, and provide further evidence for the stability of ECoG for speech-based BCIs.
- Published
- 2024
- Full Text
- View/download PDF
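The 3-stage pipeline summarized in the abstract above (speech detection, decoding to an acoustic representation, then synthesis) can be pictured with a minimal sketch. The layer sizes, channel counts, and use of GRU layers below are illustrative assumptions rather than the authors' architecture, and the synthesis stage is only indicated in a comment.

```python
# Minimal, hypothetical sketch of a 3-stage ECoG-to-speech pipeline.
# Shapes and layer choices are assumptions for illustration only.
import torch
import torch.nn as nn

class SpeechDetector(nn.Module):
    """Stage 1: flag time windows that contain attempted speech."""
    def __init__(self, n_channels=128, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, ecog):                 # ecog: (batch, time, channels)
        h, _ = self.rnn(ecog)
        return torch.sigmoid(self.out(h))    # per-timestep speech probability

class AcousticDecoder(nn.Module):
    """Stage 2: map detected ECoG segments to acoustic features (e.g., mel bins)."""
    def __init__(self, n_channels=128, hidden=128, n_mels=40):
        super().__init__()
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_mels)

    def forward(self, ecog):
        h, _ = self.rnn(ecog)
        return self.out(h)                   # (batch, time, n_mels)

# Stage 3 (synthesis) would convert the predicted acoustic features to a
# waveform with a vocoder; omitted here for brevity.
if __name__ == "__main__":
    ecog = torch.randn(1, 200, 128)          # 1 trial, 200 time bins, 128 channels
    print(SpeechDetector()(ecog).shape, AcousticDecoder()(ecog).shape)
```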
4. Stable Decoding from a Speech BCI Enables Control for an Individual with ALS without Recalibration for 3 Months
- Author
-
Shiyu Luo, Miguel Angrick, Christopher Coogan, Daniel N. Candrea, Kimberley Wyse‐Sookoo, Samyak Shah, Qinwan Rabbani, Griffin W. Milsap, Alexander R. Weiss, William S. Anderson, Donna C. Tippett, Nicholas J. Maragakis, Lora L. Clawson, Mariska J. Vansteensel, Brock A. Wester, Francesco V. Tenore, Hynek Hermansky, Matthew S. Fifer, Nick F. Ramsey, and Nathan E. Crone
- Subjects
amyotrophic lateral sclerosis (ALS), brain‐computer interfaces, neural decoding, speech brain‐computer interface (BCI), Science - Abstract
Abstract Brain‐computer interfaces (BCIs) can be used to control assistive devices by patients with neurological disorders like amyotrophic lateral sclerosis (ALS) that limit speech and movement. For assistive control, it is desirable for BCI systems to be accurate and reliable, preferably with minimal setup time. In this study, a participant with severe dysarthria due to ALS operates computer applications with six intuitive speech commands via a chronic electrocorticographic (ECoG) implant over the ventral sensorimotor cortex. Speech commands are accurately detected and decoded (median accuracy: 90.59%) throughout a 3‐month study period without model retraining or recalibration. Use of the BCI does not require exogenous timing cues, enabling the participant to issue self‐paced commands at will. These results demonstrate that a chronically implanted ECoG‐based speech BCI can reliably control assistive devices over long time periods with only initial model training and calibration, supporting the feasibility of unassisted home use.
- Published
- 2023
- Full Text
- View/download PDF
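As an illustration of the self-paced detect-then-classify scheme described in the entry above, the sketch below gates a six-command classifier on a detection probability so that a command is issued only when attempted speech is detected, without exogenous timing cues. The feature dimensions, threshold, synthetic training data, and choice of a linear discriminant classifier are assumptions for the example, not the study's actual model.

```python
# Illustrative self-paced command detection and classification from windowed
# high-gamma features; all data and parameters here are made up.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
COMMANDS = ["left", "right", "up", "down", "enter", "back"]

# Fake training data: 120 windows x 64 high-gamma channel features.
X_train = rng.normal(size=(120, 64))
y_train = rng.integers(0, len(COMMANDS), size=120)
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)

def step(window_features, speech_prob, threshold=0.8):
    """Issue a command only when a speech event is detected (self-paced)."""
    if speech_prob < threshold:
        return None                          # no detection, no command
    return COMMANDS[int(clf.predict(window_features[None, :])[0])]

print(step(rng.normal(size=64), speech_prob=0.95))
```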
5. Characteristics and stability of sensorimotor activity driven by isolated-muscle group activation in a human with tetraplegia
- Author
-
Robert W. Nickl, Manuel A. Anaya, Tessy M. Thomas, Matthew S. Fifer, Daniel N. Candrea, David P. McMullen, Margaret C. Thompson, Luke E. Osborn, William S. Anderson, Brock A. Wester, Francesco V. Tenore, Nathan E. Crone, Gabriela L. Cantarero, and Pablo A. Celnik
- Subjects
Medicine ,Science - Abstract
Abstract Understanding the cortical representations of movements and their stability can shed light on improved brain-machine interface (BMI) approaches to decode these representations without frequent recalibration. Here, we characterize the spatial organization (somatotopy) and stability of the bilateral sensorimotor map of forearm muscles in a study participant with an incomplete high spinal cord injury, implanted bilaterally in the primary motor and sensory cortices with Utah microelectrode arrays (MEAs). We built representation maps by recording bilateral multiunit activity (MUA) and surface electromyography (EMG) as the participant executed voluntary contractions of the extensor carpi radialis (ECR) and attempted motions of the flexor carpi radialis (FCR), which was paralytic. To assess stability, we repeatedly mapped and compared left- and right-wrist-extensor-related activity throughout several sessions, comparing somatotopy of active electrodes, as well as neural signals both at the within-electrode (multiunit) and cross-electrode (network) levels. Wrist motions showed significant activation in motor and sensory cortical electrodes. Within electrodes, firing strength stability diminished as the time between consecutive measurements increased (hours within a session, or days across sessions), with higher stability observed in sensory cortex than in motor cortex, and in the contralateral hemisphere than in the ipsilateral hemisphere. However, we observed no differences at the network level, and no evidence of decoding instabilities for wrist EMG, either across timespans of hours or days or across recording areas. While map stability differs between brain areas and hemispheres at the multiunit/electrode level, these differences are nullified at the ensemble level.
- Published
- 2022
- Full Text
- View/download PDF
6. Perceived timing of cutaneous vibration and intracortical microstimulation of human somatosensory cortex
- Author
-
Breanne Christie, Luke E. Osborn, David P. McMullen, Ambarish S. Pawar, Tessy M. Thomas, Sliman J. Bensmaia, Pablo A. Celnik, Matthew S. Fifer, and Francesco V. Tenore
- Subjects
Electrical stimulation, Brain-computer interface, Somatosensation, Touch, Latency, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
Background: Intracortical microstimulation (ICMS) of somatosensory cortex can partially restore the sense of touch. Though ICMS bypasses much of the neuraxis, prior studies have found that conscious detection of touch elicited by ICMS lags behind the detection of cutaneous vibration. These findings may have been influenced by mismatched stimulus intensities, which can impact temporal perception. Objective: Evaluate the relative latency at which intensity-matched vibration and ICMS are perceived by a human participant. Methods: One person implanted with microelectrode arrays in somatosensory cortex performed reaction time and temporal order judgment (TOJ) tasks. To measure reaction time, the participant reported when he perceived vibration or ICMS. In the TOJ task, vibration and ICMS were sequentially presented and the participant reported which stimulus occurred first. To verify that the participant could distinguish between stimuli, he also performed a modality discrimination task, in which he indicated if he felt vibration, ICMS, or both. Results: When vibration was matched in perceived intensity to high-amplitude ICMS, vibration was perceived, on average, 48 ms faster than ICMS. However, in the TOJ task, both sensations arose at comparable latencies, with points of subjective simultaneity not significantly different from zero. The participant could discriminate between tactile modalities above chance level but was more inclined to report feeling vibration than ICMS. Conclusions: The latencies of ICMS-evoked percepts are slower than their mechanical counterparts. However, differences in latencies are small, particularly when stimuli are matched for intensity, implying that ICMS-based somatosensory feedback is rapid enough to be effective in neuroprosthetic applications.
- Published
- 2022
- Full Text
- View/download PDF
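The point of subjective simultaneity (PSS) referenced in the abstract above is commonly estimated by fitting a psychometric function to temporal order judgment data. The sketch below shows that general procedure on synthetic data; the stimulus onset asynchronies and response proportions are invented for illustration and are not the study's measurements.

```python
# Estimate the point of subjective simultaneity (PSS) from synthetic TOJ data
# by fitting a logistic psychometric function.
import numpy as np
from scipy.optimize import curve_fit

# Stimulus onset asynchrony (ms); positive = ICMS delivered before vibration.
soa = np.array([-200, -100, -50, -25, 0, 25, 50, 100, 200], dtype=float)
# Proportion of trials on which the participant reported "ICMS first".
p_icms_first = np.array([0.05, 0.10, 0.25, 0.40, 0.55, 0.70, 0.80, 0.95, 1.00])

def logistic(x, pss, slope):
    return 1.0 / (1.0 + np.exp(-(x - pss) / slope))

(pss, slope), _ = curve_fit(logistic, soa, p_icms_first, p0=[0.0, 30.0])
print(f"Estimated PSS: {pss:.1f} ms (0 would indicate subjective simultaneity)")
```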
7. Shared Control of Bimanual Robotic Limbs With a Brain-Machine Interface for Self-Feeding
- Author
-
David A. Handelman, Luke E. Osborn, Tessy M. Thomas, Andrew R. Badger, Margaret Thompson, Robert W. Nickl, Manuel A. Anaya, Jared M. Wormley, Gabriela L. Cantarero, David McMullen, Nathan E. Crone, Brock Wester, Pablo A. Celnik, Matthew S. Fifer, and Francesco V. Tenore
- Subjects
human machine teaming, brain computer interface (BCI), bimanual control, robotic shared control, activities of daily living (ADL), Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571 - Abstract
Advances in intelligent robotic systems and brain-machine interfaces (BMI) have helped restore functionality and independence to individuals living with sensorimotor deficits; however, tasks requiring bimanual coordination and fine manipulation remain unsolved given the technical complexity of controlling multiple degrees of freedom (DOF) across multiple limbs in a coordinated way through user input. To address this challenge, we implemented a collaborative shared control strategy to manipulate and coordinate two Modular Prosthetic Limbs (MPL) for performing a bimanual self-feeding task. A human participant with microelectrode arrays in sensorimotor brain regions provided commands to both MPLs to perform the self-feeding task, which included bimanual cutting. Motor commands were decoded from bilateral neural signals to control up to two DOFs on each MPL at a time. The shared control strategy enabled the participant to map his four-DOF control inputs, two per hand, to as many as 12 DOFs for specifying robot end effector position and orientation. Using neurally-driven shared control, the participant successfully and simultaneously controlled movements of both robotic limbs to cut and eat food in a complex bimanual self-feeding task. This demonstration of bimanual robotic system control via a BMI in collaboration with intelligent robot behavior has major implications for restoring complex movement behaviors for those living with sensorimotor deficits.
- Published
- 2022
- Full Text
- View/download PDF
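One way to picture the shared-control idea in the entry above, in which a few decoded inputs specify up to 12 robot DOFs, is as a task-dependent linear mapping from user inputs to end-effector velocities. The sketch below is purely conceptual; the mapping matrices and the "cut" task assignments are invented and are not the study's control law.

```python
# Conceptual shared-control mapping: 2 decoded inputs per hand are expanded
# into 6 end-effector DOFs per limb via task-dependent matrices (made up here).
import numpy as np

def shared_control(user_left, user_right, task_map_left, task_map_right):
    """Map 2-DOF user inputs per hand onto 6 robot DOFs per limb."""
    return task_map_left @ user_left, task_map_right @ user_right

# Example "cut" task: left limb stabilizes (z translation + small yaw),
# right limb saws (x translation + small pitch).
task_map_left = np.zeros((6, 2))
task_map_left[2, 0], task_map_left[5, 1] = 1.0, 0.2
task_map_right = np.zeros((6, 2))
task_map_right[0, 0], task_map_right[4, 1] = 1.0, 0.2

left_cmd, right_cmd = shared_control(np.array([0.5, 0.1]), np.array([-0.3, 0.0]),
                                     task_map_left, task_map_right)
print(left_cmd, right_cmd)                   # end-effector velocity commands
```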
8. Cortical Response to Expectation of Tactile Stimulation from External Anthropomorphic and Non-Anthropomorphic Systems.
- Author
-
Luke E. Osborn, Breanne P. Christie, Adam C. G. Crego, Dayann D'almeida, David P. McMullen, Robert W. Nickl, Ambarish S. Pawar, Jeremy D. Brown, Brock A. Wester, Chaz Firestone, Pablo A. Celnik, Matthew S. Fifer, and Francesco V. Tenore
- Published
- 2023
- Full Text
- View/download PDF
9. Simulation and analysis of neuromorphic tactile data for object interaction speed detection.
- Author
-
Christophe J. Brown, Harrison H. Nguyen, Margaret C. Thompson, Justin M. Joyce, Erik C. Johnson, Matthew S. Fifer, and Luke E. Osborn
- Published
- 2021
- Full Text
- View/download PDF
10. Intracortical microstimulation of somatosensory cortex generates evoked responses in motor cortex.
- Author
-
Luke E. Osborn, David P. McMullen, Breanne P. Christie, Pawel Kudela, Tessy M. Thomas, Margaret C. Thompson, Robert W. Nickl, Manuel Alejandro Anaya, Sahana Srihari, Nathan E. Crone, Brock A. Wester, Pablo A. Celnik, Gabriela L. Cantarero, Francesco V. Tenore, and Matthew S. Fifer
- Published
- 2021
- Full Text
- View/download PDF
11. Intracortical microstimulation of somatosensory cortex enables object identification through perceived sensations.
- Author
-
Luke E. Osborn, Breanne P. Christie, David P. McMullen, Robert W. Nickl, Margaret C. Thompson, Ambarish S. Pawar, Tessy M. Thomas, Manuel Alejandro Anaya, Nathan E. Crone, Brock A. Wester, Sliman J. Bensmaia, Pablo A. Celnik, Gabriela L. Cantarero, Francesco V. Tenore, and Matthew S. Fifer
- Published
- 2021
- Full Text
- View/download PDF
12. Stable Electromyographic Sequence Prediction During Movement Transitions using Temporal Convolutional Networks.
- Author
-
Joseph L. Betthauser, John T. Krall, Rahul R. Kaliki, Matthew S. Fifer, and Nitish V. Thakor
- Published
- 2019
- Full Text
- View/download PDF
13. Stable Responsive EMG Sequence Prediction and Adaptive Reinforcement With Temporal Convolutional Networks.
- Author
-
Joseph L. Betthauser, John T. Krall, Shain G. Bannowsky, György Lévay, Rahul R. Kaliki, Matthew S. Fifer, and Nitish V. Thakor
- Published
- 2020
- Full Text
- View/download PDF
14. Targeted transcutaneous electrical nerve stimulation for phantom limb sensory feedback.
- Author
-
Luke Osborn, Matthew S. Fifer, Courtney Moran, Joseph Betthauser, Robert S. Armiger, Rahul R. Kaliki, and Nitish V. Thakor
- Published
- 2017
- Full Text
- View/download PDF
15. Brain-Machine Interface Development for Finger Movement Control.
- Author
-
Tessy M. Lal, Guy Hotson, Matthew S. Fifer, David P. McMullen, Matthew S. Johannes, Kapil D. Katyal, Matthew P. Para, Robert S. Armiger, William S. Anderson, Nitish V. Thakor, Brock A. Wester, and Nathan E. Crone
- Published
- 2017
- Full Text
- View/download PDF
16. Eigenvector centrality reveals the time course of task-specific electrode connectivity in human ECoG.
- Author
-
Geoffrey I. Newman, Matthew S. Fifer, Heather L. Benz, Nathan E. Crone, and Nitish V. Thakor
- Published
- 2015
- Full Text
- View/download PDF
17. Semi-autonomous Hybrid Brain-Machine Interface.
- Author
-
David P. McMullen, Matthew S. Fifer, Brock A. Wester, Guy Hotson, Kapil D. Katyal, Matthew S. Johannes, Timothy G. McGee, Andrew Harris, Alan D. Ravitz, Michael P. McLoughlin, William S. Anderson, Nitish V. Thakor, and Nathan E. Crone
- Published
- 2015
- Full Text
- View/download PDF
18. Cortical subnetwork dynamics during human language tasks.
- Author
-
Maxwell J. Collard, Matthew S. Fifer, Heather L. Benz, David P. McMullen, Yujing Wang, Griffin Milsap, Anna Korzeniewska, and Nathan E. Crone
- Published
- 2016
- Full Text
- View/download PDF
19. Neuroprosthetic limb control with electrocorticography: Approaches and challenges.
- Author
-
Nitish V. Thakor, Matthew S. Fifer, Guy Hotson, Heather L. Benz, Geoffrey I. Newman, Griffin W. Milsap, and Nathan E. Crone
- Published
- 2014
- Full Text
- View/download PDF
20. Electrocorticographic decoding of ipsilateral reach in the setting of contralateral arm weakness from a cortical lesion.
- Author
-
Guy Hotson, Matthew S. Fifer, Soumyadipta Acharya, William S. Anderson, Nitish V. Thakor, and Nathan E. Crone
- Published
- 2012
- Full Text
- View/download PDF
21. Intracortical Somatosensory Stimulation to Elicit Fingertip Sensations in an Individual With Spinal Cord Injury
- Author
-
Matthew S. Fifer, David P. McMullen, Luke E. Osborn, Tessy M. Thomas, Breanne Christie, Robert W. Nickl, Daniel N. Candrea, Eric A. Pohlmeyer, Margaret C. Thompson, Manuel A. Anaya, Wouter Schellekens, Nick F. Ramsey, Sliman J. Bensmaia, William S. Anderson, Brock A. Wester, Nathan E. Crone, Pablo A. Celnik, Gabriela L. Cantarero, and Francesco V. Tenore
- Subjects
Touch, Humans, Somatosensory Cortex, Neurology (clinical), Hand, Electric Stimulation, Spinal Cord Injuries - Abstract
Background and Objectives: The restoration of touch to fingers and fingertips is critical to achieving dexterous neuroprosthetic control for individuals with sensorimotor dysfunction. However, localized fingertip sensations have not been evoked via intracortical microstimulation (ICMS). Methods: Using a novel intraoperative mapping approach, we implanted electrode arrays in the finger areas of left and right somatosensory cortex and delivered ICMS over a 2-year period in a human participant with spinal cord injury. Results: Stimulation evoked tactile sensations in 8 fingers, including fingertips, spanning both hands. Evoked percepts followed expected somatotopic arrangements. The subject was able to reliably identify up to 7 finger-specific sites spanning both hands in a finger discrimination task. The size of the evoked percepts was on average 33% larger than a finger pad, as assessed via manual markings of a hand image. The size of the evoked percepts increased modestly with increased stimulation intensity, growing 21% as pulse amplitude increased from 20 to 80 µA. Detection thresholds were estimated on a subset of electrodes, with estimates of 9.2 to 35 µA observed, roughly consistent with prior studies. Discussion: These results suggest that ICMS can enable the delivery of consistent and localized fingertip sensations during object manipulation by neuroprostheses for individuals with somatosensory deficits. ClinicalTrials.gov Identifier: NCT03161067.
- Published
- 2021
- Full Text
- View/download PDF
22. Asynchronous decoding of grasp aperture from human ECoG during a reach-to-grasp task.
- Author
-
Matthew S. Fifer, Mohsen Mollazadeh, Soumyadipta Acharya, Nitish V. Thakor, and Nathan E. Crone
- Published
- 2011
- Full Text
- View/download PDF
23. Stable Responsive EMG Sequence Prediction and Adaptive Reinforcement With Temporal Convolutional Networks
- Author
-
Gyorgy Levay, Nitish V. Thakor, Joseph L. Betthauser, Rahul R. Kaliki, Matthew S. Fifer, Shain G. Bannowsky, and John T. Krall
- Subjects
Computer science, Movement, Biomedical Engineering, Stability (learning theory), Artificial Limbs, Electromyography, Prosthesis, Amputees, Humans, Reinforcement learning, Artificial neural network, Reproducibility of Results, Pattern recognition, Hand, Recurrent neural network, Control system, Pattern recognition (psychology), Artificial intelligence - Abstract
Prediction of movement intentions from electromyographic (EMG) signals is typically performed with a pattern recognition approach, wherein a short dataframe of raw EMG is compressed into an instantaneous feature-encoding that is meaningful for classification. However, EMG signals are time-varying, implying that a frame-wise approach may not sufficiently incorporate temporal context into predictions, leading to erratic and unstable prediction behavior. Objective: We demonstrate that sequential prediction models and, specifically, temporal convolutional networks are able to leverage useful temporal information from EMG to achieve superior predictive performance. Methods: We compare this approach to other sequential and frame-wise models predicting 3 simultaneous hand and wrist degrees-of-freedom from 2 amputee and 13 non-amputee human subjects in a minimally constrained experiment. We also compare these models on the publicly available Ninapro and CapgMyo amputee and non-amputee datasets. Results: Temporal convolutional networks yield predictions that are significantly more accurate and stable than frame-wise models, especially during inter-class transitions, with an average response delay of 4.6 ms and simpler feature-encoding. Their performance can be further improved with adaptive reinforcement training. Significance: Sequential models that incorporate temporal information from EMG achieve superior movement prediction performance and these models allow for novel types of interactive training. Conclusions: Addressing EMG decoding as a sequential modeling problem will lead to enhancements in the reliability, responsiveness, and movement complexity available from prosthesis control systems.
- Published
- 2020
- Full Text
- View/download PDF
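The temporal convolutional approach described in the entry above rests on stacks of causal, dilated 1-D convolutions, so each prediction sees only past EMG samples. Below is a minimal sketch of such a network; the channel counts, dilation schedule, and number of movement classes are illustrative assumptions, not the published architecture.

```python
# Minimal causal, dilated temporal convolutional network (TCN) for per-timestep
# EMG movement classification; sizes are illustrative only.
import torch
import torch.nn as nn

class CausalConvBlock(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation      # left-pad so outputs see only the past
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)
        self.relu = nn.ReLU()

    def forward(self, x):                            # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))
        return self.relu(self.conv(x))

class TinyTCN(nn.Module):
    """Stack of dilated causal convolutions followed by per-timestep classification."""
    def __init__(self, n_emg=8, n_classes=7):
        super().__init__()
        self.blocks = nn.Sequential(
            CausalConvBlock(n_emg, 32, dilation=1),
            CausalConvBlock(32, 32, dilation=2),
            CausalConvBlock(32, 32, dilation=4),
        )
        self.head = nn.Conv1d(32, n_classes, kernel_size=1)

    def forward(self, emg):                          # emg: (batch, n_emg, time)
        return self.head(self.blocks(emg))           # (batch, n_classes, time)

if __name__ == "__main__":
    emg = torch.randn(1, 8, 500)                     # 8 EMG channels, 500 samples
    print(TinyTCN()(emg).shape)                      # torch.Size([1, 7, 500])
```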
24. Perceived Timing of Cutaneous Vibration and Intracortical Microstimulation of Human Somatosensory Cortex
- Author
-
Breanne Christie, Luke E. Osborn, David P. McMullen, Ambarish S. Pawar, Tessy M. Thomas, Sliman J. Bensmaia, Pablo A. Celnik, Matthew S. Fifer, and Francesco V. Tenore
- Subjects
Male, Touch, General Neuroscience, Biophysics, Humans, Somatosensory Cortex, Neurology (clinical), Microelectrodes, Vibration, Electric Stimulation - Abstract
Background: Electrically stimulating the somatosensory cortex can partially restore the sense of touch. Though this technique bypasses much of the neuraxis, prior studies with non-human primates have found that conscious detection of touch elicited by intracortical microstimulation (ICMS) lags behind the detection of vibration applied to the skin. These findings may have been influenced by a mismatch in stimulus intensity; typically, vibration is perceived as more intense than ICMS, which can significantly impact temporal perception. Objective: The goal of this study was to evaluate the relative latency at which intensity-matched vibration and ICMS are perceived in a human subject. Methods: A human participant implanted with microelectrode arrays in somatosensory cortex performed a reaction time task and a temporal order judgment (TOJ) task. In the reaction time task, the participant was presented with vibration or ICMS and verbal response times were obtained. In the TOJ task, the participant was sequentially presented with a pair of stimuli – vibration followed by ICMS or vice versa – and reported which stimulus occurred first. Results: When vibration and ICMS were matched in intensity at a “high” stimulus level, the reaction time to vibration was 48 ms faster than ICMS. When both stimuli were perceived as lower in intensity, the difference in reaction time was even larger: vibration was perceived 90 ms before ICMS. However, in the TOJ task, vibratory and ICMS sensations arose at comparable latencies, with points of subjective simultaneity that were not significantly different from zero. Conclusions: Because the perception of ICMS is slower than that of intensity-matched vibration, it may be necessary to stimulate at stronger ICMS intensities (thus decreasing reaction time) when incorporating ICMS sensory feedback into neural prostheses.
- Published
- 2021
- Full Text
- View/download PDF
25. Intracortical microstimulation of somatosensory cortex generates evoked responses in motor cortex
- Author
-
Manuel Anaya, Pablo Celnik, Matthew S. Fifer, Margaret C. Thompson, Brock A. Wester, Breanne P. Christie, Gabriela Cantarero, Pawel Kudela, Sahana Srihari, Francesco Tenore, Luke Osborn, Nathan E. Crone, Robert W. Nickl, Tessy M. Thomas, and David P. McMullen
- Subjects
Multielectrode array, Tactile perception, Stimulus (physiology), Somatosensory system, Neuromodulation (medicine), Cerebral cortex, Evoked potential, Neuroscience, Motor cortex
The complex nature of neural connections throughout the cerebral cortex has led to broad interest in understanding cortical functional networks of tactile perception and sensorimotor integration. Cortico-cortical evoked potentials (CCEPs) can be used as physiological markers to study and map cerebral networks in the brain. In a human participant with bi-hemispheric microelectrode array implants in sensorimotor regions of the brain, we found that intracortical microstimulation (ICMS) of the primary somatosensory cortex can lead to evoked responses in the motor cortex in the same hemisphere, indicating connectivity between these sensorimotor regions. Single ICMS pulses were not consciously perceived, but elicited a rapid evoked potential approximately 20 ms after stimulus onset. Multi-pulse ICMS trains, perceived as tactile sensations in the thumb, sustained over an approximately 33 ms period, led to a delayed evoked response roughly 80 ms after stimulus onset. This work is important not only for better understanding the functional relationship between cortical areas, specifically somatosensory and motor cortices, but also for providing insight into pathways where neuromodulation techniques could be employed for rehabilitation or mitigation of sensorimotor neurodegenerative effects.
- Published
- 2021
- Full Text
- View/download PDF
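The evoked responses discussed in the entry above are typically computed as stimulus-triggered averages of the recorded signal around each ICMS pulse. The sketch below shows that basic computation on synthetic data; the sampling rate, window lengths, baseline correction, and the noise signal standing in for a cortical recording are all assumptions for illustration.

```python
# Stimulus-triggered averaging to estimate an evoked response around ICMS pulses.
import numpy as np

fs = 1000                                   # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)
signal = rng.normal(size=60 * fs)           # 60 s of noise standing in for a recording
stim_samples = np.arange(1 * fs, 55 * fs, fs)   # one pulse per second

def evoked_response(signal, stim_samples, pre_ms=50, post_ms=150, fs=1000):
    pre, post = int(pre_ms * fs / 1000), int(post_ms * fs / 1000)
    epochs = np.stack([signal[t - pre:t + post] for t in stim_samples])
    epochs -= epochs[:, :pre].mean(axis=1, keepdims=True)   # baseline-correct each epoch
    return epochs.mean(axis=0)              # average across stimulation pulses

erp = evoked_response(signal, stim_samples)
print(erp.shape)                            # (200,) samples spanning -50 ms to +150 ms
```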
26. Targeted Transcutaneous Electrical Nerve Stimulation for Phantom Limb Sensory Feedback
- Author
-
Nitish V. Thakor, Matthew S. Fifer, Luke Osborn, Courtney W. Moran, Robert S. Armiger, Joseph L. Betthauser, and Rahul R. Kaliki
- Subjects
Computer science, Phantom limb, Sensory system, Stimulation, Biological neuron model, Somatosensory system, Transcutaneous electrical nerve stimulation, Neuromorphic engineering, Reinnervation, Biomedical engineering
In this work, we investigated the use of noninvasive, targeted transcutaneous electrical nerve stimulation (TENS) of peripheral nerves to provide sensory feedback to two amputees, one with targeted sensory reinnervation (TSR) and one without TSR. A major step in developing a closed-loop prosthesis is providing the sense of touch back to the amputee user. We investigated the effect of targeted nerve stimulation amplitude, pulse width, and frequency on stimulation perception. We discovered that both subjects were able to reliably detect stimulation patterns with pulses less than 1 ms. We utilized the psychophysical results to produce a subject-specific stimulation pattern using a leaky integrate-and-fire (LIF) neuron model driven by force sensors on a prosthetic hand during a grasping task. For the first time, we show that TENS is able to provide graded sensory feedback at multiple sites in both TSR and non-TSR amputees while using behavioral results to tune a neuromorphic stimulation pattern driven by a force sensor output from a prosthetic hand.
- Published
- 2021
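The entry above describes driving stimulation with a leaky integrate-and-fire (LIF) neuron model fed by prosthetic force sensors. The sketch below shows a generic LIF encoder of that kind; the time constant, gain, threshold, and force profile are arbitrary illustrative values rather than the subject-specific parameters tuned in the study.

```python
# Generic leaky integrate-and-fire (LIF) encoder that converts a sampled force
# trace into stimulation pulse times; all parameters here are illustrative.
import numpy as np

def lif_encode(force, dt=0.001, tau=0.05, threshold=1.0, gain=40.0):
    """Return stimulation pulse times (s) for a sampled force trace."""
    v, pulses = 0.0, []
    for i, f in enumerate(force):
        v += dt * (-v / tau + gain * f)      # leaky integration of the force input
        if v >= threshold:
            pulses.append(i * dt)            # emit a stimulation pulse
            v = 0.0                          # reset the membrane potential
    return pulses

t = np.arange(0.0, 2.0, 0.001)
force = np.clip(np.sin(2 * np.pi * 0.5 * t), 0.0, None)   # 2 s grasp-and-release profile
pulses = lif_encode(force)
print(f"{len(pulses)} pulses; first few at {np.round(pulses[:5], 3)} s")
```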
27. Monitoring at-home prosthesis control improvements through real-time data logging
- Author
-
Luke E Osborn, Courtney W Moran, Lauren D Dodd, Erin E Sutton, Nicolas Norena Acosta, Jared M Wormley, Connor O Pyles, Kelles D Gordge, Michelle J Nordstrom, Josef A Butkus, Jonathan A Forsberg, Paul F Pasquina, Matthew S Fifer, and Robert S Armiger
- Subjects
Cellular and Molecular Neuroscience, Electromyography, Quality of Life, Biomedical Engineering, Humans, Artificial Limbs, Prosthesis Design, Amputation, Surgical
Objective. Validating the ability of advanced prostheses to improve function beyond the laboratory remains a critical step in enabling long-term benefits for prosthetic limb users. Approach. A nine-week take-home case study was completed with a single participant with upper limb amputation and osseointegration to better understand how an advanced prosthesis is used during daily activities. The participant was already an expert prosthesis user and used the Modular Prosthetic Limb (MPL) at home during the study. The MPL was controlled using wireless electromyography (EMG) pattern recognition-based movement decoding. Clinical assessments were performed before and after the take-home portion of the study. Data were recorded using an onboard data log to measure daily prosthesis usage, sensor data, and EMG data. Main results. The participant’s continuous prosthesis usage steadily increased (p = 0.04, max = 5.5 h) over time, and over 30% of the total time was spent actively controlling the prosthesis. The duration of prosthesis usage after each pattern recognition training session also increased over time (p = 0.04), resulting in up to 5.4 h of usage before retraining the movement decoding algorithm. Pattern recognition control accuracy improved (1.2% per week, p < 0.001) with a maximum of ten classes trained at once, and transitions between different degrees of freedom increased as the study progressed, indicating smooth and efficient control of the advanced prosthesis. Variability of decoding accuracy also decreased with prosthesis usage (p < 0.001), and 30% of the time was spent performing a prosthesis movement. During clinical evaluations, Box and Blocks and Assessment of the Capacity for Myoelectric Control scores increased by 43% and 6.2%, respectively, demonstrating prosthesis functionality, and NASA Task Load Index scores decreased, on average, by 25% across assessments, indicating reduced cognitive workload while using the MPL over the nine-week study. Significance. In this case study, we demonstrate that an onboard system to monitor prosthesis usage enables better understanding of how prostheses are incorporated into daily life. That knowledge can support the long-term goal of completely restoring independence and quality of life to individuals living with upper limb amputation.
- Published
- 2022
- Full Text
- View/download PDF
28. Extended home use of an advanced osseointegrated prosthetic arm improves function, performance, and control efficiency
- Author
-
Matthew S. Johannes, Josef Butkus, Erin E. Sutton, Adam B. Cohen, Courtney W. Moran, Michelle Nordstrom, Luke Osborn, Robert S. Armiger, Christopher Dohopolski, Matthew S. Fifer, Brock A. Wester, Paul F. Pasquina, Jared M. Wormley, and Albert Chi
- Subjects
Computer science, Biomedical Engineering, Artificial Limbs, Electromyography, Prosthesis Design, Prosthesis, Cellular and Molecular Neuroscience, Physical medicine and rehabilitation, Osseointegration, Humans, Rehabilitation, Reproducibility of Results, Workload, Amputation, Pattern recognition (psychology), Arm, Implant - Abstract
Objective. Full restoration of arm function using a prosthesis remains a grand challenge; however, advances in robotic hardware, surgical interventions, and machine learning are bringing seamless human-machine interfacing closer to reality. Approach. Through extensive data logging over 1 year, we monitored at-home use of the dexterous Modular Prosthetic Limb controlled through pattern recognition of electromyography (EMG) by an individual with a transhumeral amputation, targeted muscle reinnervation, and osseointegration (OI). Main results. Throughout the study, continuous prosthesis usage increased (1% per week, p < 0.001) and functional metrics improved up to 26% on control assessments and 76% on perceived workload evaluations. We observed increases in torque loading on the OI implant (up to 12.5% every month, p < 0.001) and prosthesis control performance (0.5% every month, p < 0.005), indicating enhanced user integration, acceptance, and proficiency. More importantly, the EMG signal magnitude necessary for prosthesis control decreased, up to 34.7% (p < 0.001), over time without degrading performance, demonstrating improved control efficiency with a machine learning-based myoelectric pattern recognition algorithm. The participant controlled the prosthesis for up to one month without updating the pattern recognition algorithm. The participant customized prosthesis movements to perform specific tasks, such as individual finger control for piano playing and hand gestures for communication, which likely contributed to continued usage. Significance. This work demonstrates, in a single participant, the functional benefit of unconstrained use of a highly anthropomorphic prosthetic limb over an extended period. While hurdles remain for widespread use, including device reliability, results replication, and technical maturity beyond a prototype, this study offers insight as an example of the impact of advanced prosthesis technology for rehabilitation outside the laboratory.
- Published
- 2020
29. Novel intraoperative online functional mapping of somatosensory finger representations for targeted stimulating electrode placement: technical note
- Author
-
Matthew S. Fifer, Sliman J. Bensmaia, Gabriela Cantarero, Wouter Schellekens, Adam B. Cohen, Teresa Wojtasiewicz, Nathan E. Crone, Nick F. Ramsey, Brock A. Wester, William S. Anderson, Chad R. Gordon, Tessy M. Thomas, David P. McMullen, Eric A. Pohlmeyer, Francesco Tenore, Luke Osborn, Christopher Coogan, Adam Schiavi, Daniel N. Candrea, Robert W. Nickl, and Pablo Celnik
- Subjects
medicine.diagnostic_test ,business.industry ,Motor control ,Sensory system ,General Medicine ,Multielectrode array ,Somatosensory system ,03 medical and health sciences ,Functional mapping ,0302 clinical medicine ,030220 oncology & carcinogenesis ,Medicine ,business ,Electrode placement ,Electrocorticography ,030217 neurology & neurosurgery ,Biomedical engineering ,Brain–computer interface - Abstract
Defining eloquent cortex intraoperatively, traditionally performed by neurosurgeons to preserve patient function, can now help target electrode implantation for restoring function. Brain-machine interfaces (BMIs) have the potential to restore upper-limb motor control to paralyzed patients but require accurate placement of recording and stimulating electrodes to enable functional control of a prosthetic limb. Beyond motor decoding from recording arrays, precise placement of stimulating electrodes in cortical areas associated with finger and fingertip sensations allows for the delivery of sensory feedback that could improve dexterous control of prosthetic hands. In this study, the authors demonstrated the use of a novel intraoperative online functional mapping (OFM) technique with high-density electrocorticography to localize finger representations in human primary somatosensory cortex. In conjunction with traditional pre- and intraoperative targeting approaches, this technique enabled accurate implantation of stimulating microelectrodes, which was confirmed by postimplantation intracortical stimulation of finger and fingertip sensations. This work demonstrates the utility of intraoperative OFM and will inform future studies of closed-loop BMIs in humans.
- Published
- 2020
30. Novel intraoperative online functional mapping of somatosensory finger representations for targeted stimulating electrode placement
- Author
-
Pablo Celnik, Wouter Schellekens, Adam B. Cohen, Tessy M. Thomas, Gabriela Cantarero, David P. McMullen, Francesco Tenore, Eric A. Pohlmeyer, Matthew S. Fifer, Nick F. Ramsey, Luke Osborn, Teresa Wojtasiewicz, Brock A. Wester, Sliman J. Bensmaia, Adam Schiavi, Christopher Coogan, Robert W. Nickl, Nathan E. Crone, Daniel N. Candrea, Chad R. Gordon, and William S. Anderson
- Subjects
Functional mapping ,Eloquent cortex ,medicine.diagnostic_test ,Computer science ,medicine ,Motor control ,Sensory system ,Somatosensory system ,Electrode placement ,Electrocorticography ,Brain–computer interface ,Biomedical engineering - Abstract
Defining eloquent cortex intraoperatively, traditionally performed by neurosurgeons to preserve patient function, can now help target electrode implantation for restoring function. Brain-machine interfaces (BMIs) have the potential to restore upper-limb motor control to paralyzed patients but require accurate placement of recording and stimulating electrodes to enable functional control of a prosthetic limb. Beyond motor decoding from recording arrays, precise placement of stimulating electrodes in cortical areas associated with finger and fingertip sensations allows for the delivery of sensory feedback that could improve dexterous control of prosthetic hands. In our study, we demonstrated the use of a novel intraoperative online functional mapping (OFM) technique with high-density electrocorticography (ECoG) to localize finger representations in human primary somatosensory cortex. In conjunction with traditional pre- and intraoperative targeting approaches, this technique enabled accurate implantation of stimulating microelectrodes, which was confirmed by post-implantation intra-cortical stimulation of finger and fingertip sensations. This work demonstrates the utility of intraoperative OFM and will inform future studies of closed-loop BMIs in humans.
- Published
- 2020
- Full Text
- View/download PDF
31. Characteristics and stability of sensorimotor activity driven by isolated-muscle group activation in a human with tetraplegia
- Author
-
Robert W. Nickl, Manuel A. Anaya, Tessy M. Thomas, Matthew S. Fifer, Daniel N. Candrea, David P McMullen, Margaret C. Thompson, Luke E. Osborn, William S. Anderson, Brock A. Wester, Francesco V. Tenore, Nathan E. Crone, Gabriela L. Cantarero, and Pablo A. Celnik
- Subjects
Spatial stability ,Multidisciplinary ,Electromyography ,Movement ,Sensory system ,Biology ,Quadriplegia ,medicine.disease ,Stability (probability) ,Forearm ,Single muscle ,medicine ,Humans ,Body region ,Muscle, Skeletal ,Muscle group ,Neuroscience ,Tetraplegia ,Brain–computer interface - Abstract
The topography and temporal stability of movement representations in sensorimotor cortex underlie the quality and durability of neural decoders for brain machine interface (BMI) technology. While single- and multi-unit activity (SUA and MUA) in sensorimotor cortex has been used to characterize the layout of the sensorimotor map, quantifying its stability has not been done outside of injury or targeted interventions. Here we aimed to characterize 1) the bilateral sensorimotor body map associated with isolated muscle group contractions and 2) the stability of multiunit firing responses for a single muscle (the extensor carpi radialis, ECR) over short (minutes) and long (days) time intervals. We concurrently recorded surface electromyograms (EMG) and MUA in a participant with incomplete high-spinal-cord injury as he executed (or attempted to execute) different metronome-paced, isolated muscle group contractions. Furthermore, for 8 recording sessions over 2 months, we characterized the sensorimotor map associated with ECR motions both within and across sessions. For each measurement period, we compared the stability of somatotopy (defined by the number of the channels on which a response was consistently detected) and firing pattern stability for each responsive channel. Stability was calculated for each channel in peri-EMG or peri-cue windows using both mean MUA firing rates and the full time-varying responses (i.e., MUA “shape”). First, we found that cortical representations of isolated muscle group contractions overlapped, even for muscles from disparate body regions such as facial and distal leg muscles; this was the case for both intact and de-efferented muscles, in both motor and sensory channels. Second, the spatial stability of somatotopy significantly changed over the course of both minutes and days, with the consistency between sessions decreasing across longer bouts of time. Firing pattern stabilities showed distinct profiles; mean MUA firing rates became less stable over time whereas MUA shape remained consistent. Interestingly, sensory channels were overall more consistent than motor channels in terms of spatial stability, mean MUA firing rates, and MUA shape. Our findings suggest that the encoding of muscle-driven specific activity in sensorimotor cortex at the level of MUA is redundant and widespread with complex spatial and temporal characteristics. These findings extend our understanding of how sensorimotor cortex represents movements, which could be leveraged for the design of non-traditional BMI approaches.
- Published
- 2020
- Full Text
- View/download PDF
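The two stability measures described in the entry above (mean MUA firing rate and the time-varying response "shape") can be illustrated by comparing peri-event responses across two sessions. The synthetic responses, window length, and the specific metrics below (relative rate change and shape correlation) are assumptions for illustration, not the study's exact analysis.

```python
# Compare mean-rate stability and response-shape stability across two sessions
# of synthetic peri-event multiunit responses for one channel.
import numpy as np

rng = np.random.default_rng(2)
template = np.exp(-((np.arange(100) - 50) ** 2) / 200.0)        # canonical response shape

# Two sessions of peri-EMG responses: 30 trials x 100 time bins each.
session_a = 20 * template + rng.normal(scale=2.0, size=(30, 100))
session_b = 18 * template + rng.normal(scale=2.0, size=(30, 100))

mean_rate_a, mean_rate_b = session_a.mean(), session_b.mean()
rate_change = abs(mean_rate_a - mean_rate_b) / max(mean_rate_a, mean_rate_b)

shape_a, shape_b = session_a.mean(axis=0), session_b.mean(axis=0)
shape_corr = np.corrcoef(shape_a, shape_b)[0, 1]                # 1.0 = identical shape

print(f"relative change in mean rate: {rate_change:.2f}")
print(f"shape correlation across sessions: {shape_corr:.2f}")
```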
32. Simultaneous classification of bilateral hand gestures using bilateral microelectrode recordings in a tetraplegic patient
- Author
-
William S. Anderson, Brock A. Wester, Matthew S. Fifer, Gabriela Cantarero, Robert W. Nickl, Margaret C. Thompson, Pablo Celnik, Daniel N. Candrea, Francesco Tenore, Luke Osborn, Nathan E. Crone, Manuel Anaya, Tessy M. Thomas, Eric A. Pohlmeyer, and David P. McMullen
- Subjects
Computer science, Movement, Speech recognition, Interface (computing), Linear discriminant analysis, Robotic arm, Gesture, Task (project management)
Most daily tasks require simultaneous control of both hands. Here we demonstrate simultaneous classification of gestures in both hands using multi-unit activity recorded from bilateral motor and somatosensory cortices of a tetraplegic participant. Attempted gestures were classified using hierarchical linear discriminant models trained separately for each hand. In an online experiment, gestures were continuously classified and used to control two robotic arms in a center-out movement task. Bimanual trials that required keeping one hand still resulted in the best performance (70.6%), followed by symmetric movement trials (50%) and asymmetric movement trials (22.7%). Our results indicate that gestures can be simultaneously decoded in both hands using two independently trained hand models concurrently, but online control using this approach becomes more difficult with increased complexity of bimanual gesture combinations. This study demonstrates the potential for restoring simultaneous control of both hands using a bilateral intracortical brain-machine interface.
- Published
- 2020
- Full Text
- View/download PDF
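The decoding scheme in the entry above trains one model per hand and runs both concurrently. The sketch below illustrates that structure with two independently trained linear discriminant models on synthetic features; the gesture set, feature dimensions, and the omission of the hierarchical detection stage are simplifications for the example.

```python
# Two independently trained per-hand classifiers applied simultaneously to
# decode a bimanual gesture combination; data and labels are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
GESTURES = ["rest", "open", "pinch", "point"]

def make_training_data(n_trials=200, n_features=96):
    X = rng.normal(size=(n_trials, n_features))
    y = rng.integers(0, len(GESTURES), size=n_trials)
    return X, y

# One classifier per hand, trained on that hand's neural features.
left_model = LinearDiscriminantAnalysis().fit(*make_training_data())
right_model = LinearDiscriminantAnalysis().fit(*make_training_data())

def decode_bimanual(features_left, features_right):
    """Run both hand models concurrently on one time bin of features."""
    g_left = GESTURES[int(left_model.predict(features_left[None, :])[0])]
    g_right = GESTURES[int(right_model.predict(features_right[None, :])[0])]
    return g_left, g_right

print(decode_bimanual(rng.normal(size=96), rng.normal(size=96)))
```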
33. Intracortical Microstimulation Elicits Human Fingertip Sensations
- Author
-
Wouter Schellekens, Matthew S. Fifer, Tessy M. Thomas, Sliman J. Bensmaia, Eric A. Pohlmeyer, Gabriela Cantarero, Brock A. Wester, Nathan E. Crone, Margaret C. Thompson, Manuel Anaya, Pablo Celnik, William S. Anderson, Daniel N. Candrea, Francesco Tenore, Luke Osborn, Robert W. Nickl, David P. McMullen, and Nick F. Ramsey
- Subjects
body regions, Cutaneous sensation, Intracortical microstimulation, Physical medicine and rehabilitation, integumentary system, Percept
The restoration of cutaneous sensation to fingers and fingertips is critical to achieving dexterous prosthesis control for individuals with sensorimotor dysfunction. However, localized and reproducible fingertip sensations have not previously been reported in humans via intracortical microstimulation (ICMS). Here, we show that ICMS in a human participant was capable of eliciting percepts in 7 fingers spanning both hands, including 6 fingertip regions (i.e., 3 on each hand). Median percept size was estimated to include 1.40 finger or palmar segments (e.g., one segment being a fingertip or the section of upper palm below a finger). This was corroborated with a more sensitive manual marking technique where median percept size corresponded to roughly 120% of a fingertip segment. Percepts showed high intra-day consistency, including high performance (99%) on a blinded finger discrimination task. Across days, there was more variability in percepts, with 75.8% of trials containing the modal finger or palm region for the stimulated electrode. These results suggest that ICMS can enable the delivery of localized fingertip sensations during object manipulation by neuroprostheses.
- Published
- 2020
- Full Text
- View/download PDF
34. Decoding Native Cortical Representations for Flexion and Extension at Upper Limb Joints Using Electrocorticography
- Author
-
William S. Anderson, Tessy M. Thomas, David P. McMullen, Nathan E. Crone, Nitish V. Thakor, Daniel N. Candrea, and Matthew S. Fifer
- Subjects
Adult, Male, Wrist Joint, Adolescent, Computer science, Elbow, Biomedical Engineering, Wrist, Fingers, Machine Learning, Upper Extremity, Young Adult, Physical medicine and rehabilitation, Elbow Joint, Internal Medicine, Humans, Electrocorticography, Brain–computer interface, Cued speech, General Neuroscience, Rehabilitation, Brain-Computer Interfaces, Linear Models, Upper limb, Feasibility Studies, Female, Joints, Sensorimotor Cortex, Decoding methods, Photic Stimulation - Abstract
Brain–machine interface (BMI) researchers have traditionally focused on modeling endpoint reaching tasks to provide control of neurally driven prosthetic arms. Most previous research has focused on achieving endpoint control through a Cartesian-coordinate-centered approach. However, a joint-centered approach could potentially be used to intuitively control a wide range of limb movements. We systematically investigated the feasibility of discriminating between flexion and extension of different upper limb joints using electrocorticography (ECoG) recordings from sensorimotor cortex. Four subjects implanted with macro-ECoG (10-mm spacing), high-density ECoG (5-mm spacing), and/or micro-ECoG arrays (0.9-mm spacing and 4 mm × 4 mm coverage) performed randomly cued flexions or extensions of the fingers, wrist, or elbow contralateral to the implanted hemisphere. We trained a linear model to classify six movements using averaged high-gamma power (70–110 Hz) modulations at different latencies with respect to movement onset, and within a time interval restricted to flexion or extension at each joint. Offline decoding models for each subject classified these movements with accuracies of 62%–83%. Our results suggest that the widespread ECoG coverage of sensorimotor cortex could allow a whole limb BMI to sample native cortical representations in order to control flexion and extension at multiple joints.
- Published
- 2019
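The decoding features described in the entry above, averaged high-gamma power in windows at several latencies around movement onset, can be sketched as follows. The band-pass/Hilbert envelope estimate, window placement, synthetic data, and use of logistic regression are assumptions for illustration; the study used its own linear model and feature windows.

```python
# Illustrative high-gamma feature construction plus a linear classifier for
# six movement classes; all data here are synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.linear_model import LogisticRegression

fs = 1000
rng = np.random.default_rng(4)

def high_gamma_power(ecog, fs=1000, band=(70, 110)):
    """Band-pass each channel and take the squared analytic-signal envelope."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filt = filtfilt(b, a, ecog, axis=-1)
    return np.abs(hilbert(filt, axis=-1)) ** 2          # (channels, time)

def features_for_trial(ecog, onset, windows_ms=((-250, 0), (0, 250), (250, 500))):
    hg = high_gamma_power(ecog)
    feats = [hg[:, onset + int(a * fs / 1000): onset + int(b * fs / 1000)].mean(axis=1)
             for a, b in windows_ms]
    return np.concatenate(feats)                        # (channels * n_windows,)

# Synthetic example: 60 trials, 16 channels, 2 s per trial, movement onset at 1 s.
X = np.stack([features_for_trial(rng.normal(size=(16, 2 * fs)), onset=fs) for _ in range(60)])
y = rng.integers(0, 6, size=60)                         # six movement classes
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(X.shape, clf.score(X, y))
```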
35. Cortical subnetwork dynamics during human language tasks
- Author
-
Yujing Wang, Matthew S. Fifer, Griffin Milsap, Nathan E. Crone, Maxwell Collard, Heather L. Benz, Anna Korzeniewska, and David P. McMullen
- Subjects
Male, Elementary cognitive task, Nerve net, Computer science, Cognitive Neuroscience, Speech recognition, Models, Neurological, Population, Visual processing, Young Adult, Humans, Speech, Computer Simulation, Education, Subnetwork, Electrocorticography, Dynamic Bayesian network, Language, Cerebral Cortex, Brain Mapping, Functional connectivity, Speech processing, Task (computing), Reading, Neurology, Cerebral cortex, Female, Nerve Net
Language tasks require the coordinated activation of multiple subnetworks—groups of related cortical interactions involved in specific components of task processing. Although electrocorticography (ECoG) has sufficient temporal and spatial resolution to capture the dynamics of event-related interactions between cortical sites, it is difficult to decompose these complex spatiotemporal patterns into functionally discrete subnetworks without explicit knowledge of each subnetwork’s timing. We hypothesized that subnetworks corresponding to distinct components of task-related processing could be identified as groups of interactions with co-varying strengths. In this study, five subjects implanted with ECoG grids over language areas performed word repetition and picture naming. We estimated the interaction strength between each pair of electrodes during each task using a time-varying dynamic Bayesian network (tvDBN) model constructed from the power of high gamma (70–110 Hz) activity, a surrogate for population firing rates. We then reduced the dimensionality of this model using principal component analysis (PCA) to identify groups of interactions with co-varying strengths, which we term functional network components (FNCs). This data-driven technique estimates both the weight of each interaction’s contribution to a particular subnetwork, and the temporal profile of each subnetwork’s activation during the task. We found FNCs with temporal and anatomical features consistent with articulatory preparation in both tasks, and with auditory and visual processing in the word repetition and picture naming tasks, respectively. These FNCs were highly consistent between subjects with similar electrode placement, and were robust enough to be characterized in single trials. Furthermore, the interaction patterns uncovered by FNC analysis correlated well with recent literature suggesting important functional-anatomical distinctions between processing external and self-produced speech. Our results demonstrate that subnetwork decomposition of event-related cortical interactions is a powerful paradigm for interpreting the rich dynamics of large-scale, distributed cortical networks during human cognitive tasks.
- Published
- 2016
- Full Text
- View/download PDF
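The dimensionality-reduction step described in the entry above, PCA applied to time-varying interaction strengths to obtain functional network components, can be sketched as follows. The random matrix standing in for tvDBN interaction estimates and the choice of five components are purely illustrative.

```python
# PCA over time-varying pairwise interaction strengths: component loadings act
# as interaction weights and component scores as activation time courses.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
n_electrodes, n_timepoints = 20, 300
n_pairs = n_electrodes * (n_electrodes - 1)          # directed electrode pairs

# interactions[t, p] = estimated strength of directed interaction p at time t
# (random numbers here stand in for tvDBN estimates).
interactions = rng.normal(size=(n_timepoints, n_pairs))

pca = PCA(n_components=5)
activation_time_courses = pca.fit_transform(interactions)   # (time, components)
interaction_weights = pca.components_                        # (components, pairs)

print(activation_time_courses.shape, interaction_weights.shape)
print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
```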
36. Sensory stimulation enhances phantom limb perception and movement decoding
- Author
-
Gyorgy Levay, Gordon Cheng, Keqin Ding, Anastasios Bezerianos, Andrei Dragomir, Matthew S. Fifer, Mark M. Iskarous, Christopher L. Hunt, Zied Tayeb, Nitish V. Thakor, Luke Osborn, Rohit Bose, Mark A. Hays, and Robert S. Armiger
- Subjects
Movement, Biomedical Engineering, Phantom limb, Artificial Limbs, Stimulation, Electroencephalography, Transcutaneous electrical nerve stimulation, Prosthesis, Cellular and Molecular Neuroscience, Physical medicine and rehabilitation, Perception, Humans, Medicine, Sensory stimulation therapy, Hand
Objective: A major challenge for controlling a prosthetic arm is communication between the device and the user’s phantom limb. We show the ability to enhance amputees’ phantom limb perception and improve movement decoding through targeted transcutaneous electrical nerve stimulation (tTENS). Approach: Transcutaneous nerve stimulation experiments were performed with four amputee participants to map phantom limb perception. We measured myoelectric signals during phantom hand movements before and after amputees received sensory stimulation. Using electroencephalogram (EEG) monitoring, we measured the neural activity in sensorimotor regions during phantom movements and stimulation. In one participant, we also tracked sensory mapping over 2 years and movement decoding performance over 1 year. Main results: Results show improvements in the amputees’ ability to perceive and move the phantom hand as a result of sensory stimulation, which leads to improved movement decoding. In the extended study with one amputee, we found that sensory mapping remains stable over 2 years. Remarkably, sensory stimulation improves within-day movement decoding while performance remains stable over 1 year. From the EEG, we observed cortical correlates of sensorimotor integration and increased motor-related neural activity as a result of enhanced phantom limb perception. Significance: This work demonstrates that phantom limb perception influences prosthesis control and can benefit from targeted nerve stimulation. These findings have implications for improving prosthesis usability and function due to a heightened sense of the phantom hand.
- Published
- 2020
- Full Text
- View/download PDF
37. Changes in human brain dynamics during behavioral priming and repetition suppression
- Author
-
Stephen J. Gotts, Alex Martin, Griffin Milsap, Anna Korzeniewska, Nathan E. Crone, Mackenzie C. Cervenka, Heather L. Benz, Max Collard, Yujing Wang, and Matthew S. Fifer
- Subjects
Adult, Stimulus (physiology), Perceptual stimulus, Perception, Repetition Priming, Reaction Time, Humans, Speech, Evoked Potentials, Electrocorticography, Cerebral Cortex, Predictive coding, Epilepsy, Functional Neuroimaging, General Neuroscience, Human brain, Neurophysiology, Brain Waves, Implicit learning, Pattern Recognition, Visual, Nerve Net, Psychology, Neuroscience, Psychomotor Performance
Behavioral responses to a perceptual stimulus are typically faster with repeated exposure to the stimulus (behavioral priming). This implicit learning mechanism is critical for survival but impaired in a variety of neurological disorders, including Alzheimer's disease. Many studies of the neural bases for behavioral priming have encountered an interesting paradox: in spite of faster behavioral responses, repeated stimuli usually elicit weaker neural responses (repetition suppression). Several neurophysiological models have been proposed to resolve this paradox, but noninvasive techniques for human studies have had insufficient spatial-temporal precision for testing their predictions. Here, we used the unparalleled precision of electrocorticography (ECoG) to analyze the timing and magnitude of task-related changes in neural activation and propagation while patients named novel vs repeated visual objects. Stimulus repetition was associated with faster verbal responses and decreased neural activation (repetition suppression) in ventral occipito-temporal cortex (VOTC) and left prefrontal cortex (LPFC). Interestingly, we also observed increased neural activation (repetition enhancement) in LPFC and other recording sites. Moreover, with analysis of high gamma propagation we observed increased top-down propagation from LPFC into VOTC, preceding repetition suppression. The latter results indicate that repetition suppression and behavioral priming are associated with strengthening of top-down network influences on perceptual processing, consistent with predictive coding models of repetition suppression, and they support a central role for changes in large-scale cortical dynamics in achieving more efficient and rapid behavioral responses.
- Published
- 2020
- Full Text
- View/download PDF
38. Beyond intuitive anthropomorphic control: recent achievements using brain computer interface technologies
- Author
-
Brock A. Wester, Matthew S. Johannes, Eric A. Pohlmeyer, Johnathan Pino, Matthew Rich, John B. Helder, Chris Dohopolski, Michael P. McLoughlin, James D. Beaty, Denise D'Angelo, Matthew S. Fifer, Sliman J. Bensmaia, and Francesco Tenore
- Subjects
0301 basic medicine ,03 medical and health sciences ,030104 developmental biology ,0302 clinical medicine ,Intracortical microstimulation ,Human–computer interaction ,Computer science ,Interface (computing) ,Control (management) ,Prosthetic limb ,030217 neurology & neurosurgery ,Brain–computer interface ,Task (project management) - Abstract
Brain-computer interface (BCI) research has progressed rapidly, with BCIs shifting from animal tests to human demonstrations of controlling computer cursors and even advanced prosthetic limbs, the latter having been the goal of the Revolutionizing Prosthetics (RP) program. These achievements now include direct electrical intracortical microstimulation (ICMS) of the brain to provide human BCI users feedback information from the sensors of prosthetic limbs. These successes raise the question of how well people would be able to use BCIs to interact with systems that are not based directly on the body (as prosthetic arms are), and how well BCI users could interpret ICMS information from such devices. If paralyzed individuals could use BCIs to effectively interact with such non-anthropomorphic systems, it would offer them numerous new opportunities to control novel assistive devices. Here we explore how well a participant with tetraplegia can detect infrared (IR) sources in the environment using a prosthetic-arm-mounted camera that encodes IR information via ICMS. We also investigate how well a BCI user could transition from controlling a BCI based on prosthetic arm movements to controlling a flight simulator, a system with different physical dynamics than the arm. In that test, the BCI participant used environmental information encoded via ICMS to identify which of several upcoming flight routes was the best option. For both tasks, the BCI user was able to quickly learn how to interpret the ICMS-provided information to achieve the task goals.
- Published
- 2017
- Full Text
- View/download PDF
39. Brain-Machine Interface Development for Finger Movement Control
- Author
-
Nitish V. Thakor, David P. McMullen, Tessy M. Lal, Guy Hotson, William S. Anderson, Matthew P. Para, Kapil D. Katyal, Brock A. Wester, Matthew S. Johannes, Robert S. Armiger, Nathan E. Crone, and Matthew S. Fifer
- Subjects
medicine.medical_specialty ,medicine.diagnostic_test ,0206 medical engineering ,02 engineering and technology ,Human brain ,020601 biomedical engineering ,Signal acquisition ,03 medical and health sciences ,Finger movement ,0302 clinical medicine ,medicine.anatomical_structure ,Physical medicine and rehabilitation ,medicine ,Neural control ,Psychology ,Control (linguistics) ,Electrocorticography ,030217 neurology & neurosurgery ,Movement control ,Brain–computer interface - Abstract
There have been many developments in brain-machine interfaces (BMIs) for controlling upper limb movements such as reaching and grasping. One way to expand the usefulness of BMIs in replacing motor functions for patients with spinal cord injuries and neuromuscular disorders would be to improve the dexterity of the upper limb movements they support by adding control of individual finger movements. Many studies have focused on understanding the organization of movement control in the sensorimotor cortex of the human brain. Finding the specific mechanisms for neural control of different movements will help focus signal acquisition and processing so as to improve BMI control of complex actions. In a recently published study, we demonstrated, for the first time, online BMI control of individual finger movements using electrocorticography recordings from the hand area of sensorimotor cortex. This study expands the possibilities for combined control of arm movements and more dexterous hand and finger movements.
- Published
- 2017
- Full Text
- View/download PDF
40. Demonstration of a Semi-Autonomous Hybrid Brain–Machine Interface Using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic
- Author
-
Guy Hotson, Brock A. Wester, William S. Anderson, Matthew S. Fifer, Nitish V. Thakor, Matthew S. Johannes, Nathan E. Crone, Alan Ravitz, Timothy G. McGee, Kapil D. Katyal, Andrew L. Harris, David P. McMullen, and R. Jacob Vogelstein
- Subjects
Adult ,Male ,Eye Movements ,Computer science ,Biomedical Engineering ,Artificial Limbs ,Pilot Projects ,Prosthesis Design ,Article ,Task (project management) ,Supervisory control ,Artificial Intelligence ,Internal Medicine ,Humans ,Computer vision ,Man-Machine Systems ,Simulation ,Brain–computer interface ,business.industry ,General Neuroscience ,Rehabilitation ,Eye movement ,Electroencephalography ,Robotics ,Object (computer science) ,Equipment Failure Analysis ,Brain-Computer Interfaces ,Therapy, Computer-Assisted ,Eye tracking ,Female ,Augmented reality ,Artificial intelligence ,business - Abstract
To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 seconds for task completion after system improvements were implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.
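The division of labor described above (gaze and computer vision select the object, a decoded neural signal triggers a semi-autonomous action) can be summarized as a small supervisory state machine. The sketch below is a conceptual simplification, not the HARMONIE implementation; all states, arguments, and function names are hypothetical.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()              # waiting for the user to fixate an object
    OBJECT_SELECTED = auto()   # computer vision has confirmed a gaze target
    EXECUTING = auto()         # robot performs the reach-grasp-and-drop autonomously

def supervisory_step(state, gaze_target, movement_detected, plan_done):
    """One tick of a simplified supervisory loop: gaze/vision pick the object,
    a decoded movement intent triggers the semi-autonomous action."""
    if state is State.IDLE and gaze_target is not None:
        return State.OBJECT_SELECTED
    if state is State.OBJECT_SELECTED and movement_detected:
        return State.EXECUTING        # hand off to route planning on the prosthetic
    if state is State.EXECUTING and plan_done:
        return State.IDLE
    return state

# Example: an object is fixated, then a movement intent is decoded
state = supervisory_step(State.IDLE, gaze_target="ball", movement_detected=False, plan_done=False)
state = supervisory_step(state, gaze_target="ball", movement_detected=True, plan_done=False)
print(state)  # State.EXECUTING
```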
- Published
- 2014
- Full Text
- View/download PDF
41. Individual finger control of a modular prosthetic limb using high-density electrocorticography in a human subject
- Author
-
Nathan E. Crone, Matthew S. Fifer, Robert S. Armiger, Brock A. Wester, David P. McMullen, Matthew P. Para, Guy Hotson, Matthew S. Johannes, Kapil D. Katyal, William S. Anderson, and Nitish V. Thakor
- Subjects
Male ,Speech recognition ,Movement ,0206 medical engineering ,Biomedical Engineering ,Sensory system ,Artificial Limbs ,02 engineering and technology ,Vibration ,Article ,Fingers ,03 medical and health sciences ,Cellular and Molecular Neuroscience ,User-Computer Interface ,Young Adult ,0302 clinical medicine ,Ring finger ,medicine ,Humans ,Electrocorticography ,Brain–computer interface ,Cued speech ,medicine.diagnostic_test ,business.industry ,Modular design ,Linear discriminant analysis ,020601 biomedical engineering ,Numerical digit ,Electrodes, Implanted ,body regions ,medicine.anatomical_structure ,Brain-Computer Interfaces ,Sensorimotor Cortex ,Psychology ,business ,030217 neurology & neurosurgery - Abstract
Objective. We used native sensorimotor representations of fingers in a brain–machine interface (BMI) to achieve immediate online control of individual prosthetic fingers. Approach. Using high gamma responses recorded with a high-density electrocorticography (ECoG) array, we rapidly mapped the functional anatomy of cued finger movements. We used these cortical maps to select ECoG electrodes for a hierarchical linear discriminant analysis classification scheme to predict: (1) if any finger was moving, and, if so, (2) which digit was moving. To account for sensory feedback, we also mapped the spatiotemporal activation elicited by vibrotactile stimulation. Finally, we used this prediction framework to provide immediate online control over individual fingers of the Johns Hopkins University Applied Physics Laboratory modular prosthetic limb. Main results. The balanced classification accuracy for detection of movements during the online control session was 92% (chance: 50%). At the onset of movement, finger classification accuracy was 76% (chance: 20%), and 88% (chance: 25%) if the pinky and ring finger movements were coupled. The balanced accuracy for fully flexing the cued finger was 64%, or 77% when pinky and ring commands were combined. Offline decoding yielded a peak finger decoding accuracy of 96.5% (chance: 20%) when using an optimized selection of electrodes. Offline analysis demonstrated significant finger-specific activations throughout sensorimotor cortex. Activations either prior to movement onset or during sensory feedback led to discriminable finger control. Significance. Our results demonstrate the ability of ECoG-based BMIs to leverage the native functional anatomy of sensorimotor cortical populations to immediately control individual finger movements in real time.
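A minimal sketch of a two-stage (hierarchical) linear discriminant analysis scheme of the kind described above is shown below, using scikit-learn on randomly generated stand-in features. The feature matrix, labels, and dimensions are hypothetical; the study's actual electrode selection, feature extraction, and training procedure are not reproduced here.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_trials, n_features = 200, 32                  # hypothetical high-gamma features per trial
X = rng.standard_normal((n_trials, n_features))
moving = rng.integers(0, 2, n_trials)           # stage 1 labels: rest vs. movement
finger = rng.integers(0, 5, n_trials)           # stage 2 labels: which digit (0-4)

# Stage 1: detect whether any finger is moving; Stage 2: classify which digit
detector = LinearDiscriminantAnalysis().fit(X, moving)
classifier = LinearDiscriminantAnalysis().fit(X[moving == 1], finger[moving == 1])

def predict_finger(x):
    """Hierarchical prediction: only ask 'which finger?' if movement is detected."""
    if detector.predict(x.reshape(1, -1))[0] == 0:
        return None                              # no movement detected
    return int(classifier.predict(x.reshape(1, -1))[0])

print(predict_finger(X[0]))
```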
- Published
- 2016
42. Toward Electrocorticographic Control of a Dexterous Upper Limb Prosthesis: Building Brain-Machine Interfaces
- Author
-
Mohsen Mollazadeh, Nitish V. Thakor, Matthew S. Fifer, Soumyadipta Acharya, Heather L. Benz, and Nathan E. Crone
- Subjects
Engineering ,medicine.diagnostic_test ,business.industry ,Remote patient monitoring ,Neural Prosthesis ,GRASP ,Biomedical Engineering ,Biomechanics ,Upper limb prosthesis ,General Medicine ,Electroencephalography ,Neurophysiology ,body regions ,medicine ,business ,neoplasms ,Brain–computer interface ,Biomedical engineering - Abstract
This paper presents an ECoG-based system for controlling the MPL, in which patients implanted with ECoG electrode grids for clinical seizure mapping were asked to perform various recorded finger or grasp movements.
- Published
- 2012
- Full Text
- View/download PDF
43. Neuroprosthetic limb control with electrocorticography: approaches and challenges
- Author
-
Heather L. Benz, Griffin Milsap, Geoffrey I. Newman, Nitish V. Thakor, Nathan E. Crone, Matthew S. Fifer, and Guy Hotson
- Subjects
Cerebral Cortex ,Engineering ,medicine.diagnostic_test ,business.industry ,Movement ,Prosthetic limb ,Action Potentials ,Prostheses and Implants ,Upper Extremity ,Cortical control ,Brain-Computer Interfaces ,medicine ,Animals ,Humans ,Cortical surface ,Electrocorticography ,business ,Neuroscience ,Electrodes - Abstract
Advanced upper limb prosthetics, such as the Johns Hopkins Applied Physics Lab Modular Prosthetic Limb (MPL), are now available for research and preliminary clinical applications. Research attention has shifted to developing means of controlling these prostheses. Penetrating microelectrode arrays are often used in animal and human models to decode action potentials for cortical control. These arrays may suffer signal loss over the long-term and therefore should not be the only implant type investigated for chronic BMI use. Electrocorticographic (ECoG) signals from electrodes on the cortical surface may provide more stable long-term recordings. Several studies have demonstrated ECoG's potential for decoding cortical activity. As a result, clinical studies are investigating ECoG encoding of limb movement, as well as its use for interfacing with and controlling advanced prosthetic arms. This overview presents the technical state of the art in the use of ECoG in controlling prostheses. Technical limitations of the current approach and future directions are also presented.
- Published
- 2015
44. Semi-autonomous Hybrid Brain-Machine Interface
- Author
-
Michael P. McLoughlin, Matthew S. Fifer, William S. Anderson, David P. McMullen, Brock A. Wester, Timothy G. McGee, Nitish V. Thakor, Kapil D. Katyal, Nathan E. Crone, Matthew S. Johannes, Alan Ravitz, Guy Hotson, and Andrew L. Harris
- Subjects
Supervisory control ,business.industry ,Computer science ,Human–computer interaction ,Control (management) ,Eye tracking ,Augmented reality ,Robotics ,Artificial intelligence ,Modular design ,Object (computer science) ,business ,Brain–computer interface - Abstract
Although advanced prosthetic limbs, such as the modular prosthetic limb (MPL), are now capable of mimicking the dexterity of human limbs, brain-machine interfaces (BMIs) are not yet able to take full advantage of their capabilities. To improve BMI control of the MPL, we are developing a semi-autonomous system, the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system is designed to utilize novel control strategies including hybrid input (adding eye tracking to neural control), supervisory control (decoding high-level patient goals), and intelligent robotics (incorporating computer vision and route planning algorithms). Patients use eye gaze to indicate a desired object that has been recognized by computer vision. They then perform a desired action, such as reaching and grasping, which is decoded and carried out by the MPL via route planning algorithms. Here we present two patients, implanted with electrocorticography (ECoG) and depth electrodes, who controlled the HARMONIE system to perform reach and grasping tasks; in addition, one patient also used the HARMONIE system to simulate self-feeding. This work builds upon prior research to demonstrate the feasibility of using novel control strategies to enable patients to perform a wider variety of activities of daily living (ADLs).
- Published
- 2015
- Full Text
- View/download PDF
45. Simultaneous neural control of simple reaching and grasping with the modular prosthetic limb using intracranial EEG
- Author
-
Matthew S. Johannes, Guy Hotson, David P. McMullen, R. Jacob Vogelstein, Matthew P. Para, John B. Helder, Kapil D. Katyal, William S. Anderson, Nitish V. Thakor, Brock A. Wester, Yujing Wang, Nathan E. Crone, and Matthew S. Fifer
- Subjects
Adult ,Male ,Computer science ,Biomedical Engineering ,Prosthetic limb ,Artificial Limbs ,Electroencephalography ,Online Systems ,Article ,Hand strength ,Internal Medicine ,medicine ,Neural control ,Humans ,Computer vision ,medicine.diagnostic_test ,Anthropometry ,Hand Strength ,business.industry ,General Neuroscience ,Rehabilitation ,Reproducibility of Results ,Modular design ,Neurophysiology ,Middle Aged ,Electrodes, Implanted ,Functional mapping ,Gait analysis ,Female ,Artificial intelligence ,business ,Psychomotor Performance - Abstract
Intracranial electroencephalographic (iEEG) signals from two human subjects were used to achieve simultaneous neural control of reaching and grasping movements with the Johns Hopkins University Applied Physics Lab (JHU/APL) Modular Prosthetic Limb (MPL), a dexterous robotic prosthetic arm. We performed functional mapping of high gamma activity while the subject made reaching and grasping movements to identify task-selective electrodes. Independent, online control of reaching and grasping was then achieved using high gamma activity from a small subset of electrodes with a model trained on short blocks of reaching and grasping with no further adaptation. Classification accuracy did not decline (p
- Published
- 2013
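To illustrate the idea in entry 45 above of decoding reach and grasp independently and simultaneously from small, task-selective electrode subsets, here is a deliberately simple threshold-based sketch. The electrode groupings, feature values, and thresholds are invented for illustration and do not reflect the study's decoding model.

```python
import numpy as np

# Hypothetical per-electrode high-gamma envelope values for one decoding time step
hg = {"reach_electrodes": np.array([1.8, 2.1, 1.6]),
      "grasp_electrodes": np.array([0.4, 0.5])}

REACH_THRESHOLD = 1.5   # illustrative z-score thresholds, not the study's values
GRASP_THRESHOLD = 1.5

def decode_commands(hg_features):
    """Issue independent reach and grasp commands from separate electrode subsets,
    mirroring the idea of simultaneous but separately decoded movements."""
    reach = hg_features["reach_electrodes"].mean() > REACH_THRESHOLD
    grasp = hg_features["grasp_electrodes"].mean() > GRASP_THRESHOLD
    return {"reach": bool(reach), "grasp": bool(grasp)}

print(decode_commands(hg))  # e.g., {'reach': True, 'grasp': False}
```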
46. Design and implementation of a human ECoG simulator for testing brain-machine interfaces
- Author
-
Matthew S. Fifer, Nitish V. Thakor, William S. Anderson, David P. McMullen, Elliot Greenwald, Nathan E. Crone, Ramana Vinjamuri, and Griffin Milsap
- Subjects
Modulation ,Computer science ,Interface (computing) ,Amplifier ,Frequency domain ,Spectrogram ,Time domain ,Signal ,Simulation ,Brain–computer interface - Abstract
This paper presents the design and implementation of a signal simulator that emulates event-related human electrocorticographic (ECoG) signals. This real-time simulator renders a representative model of human ECoG encompassing prominent physiological modulation in the time domain (e.g., event-related potentials, or ERPs) and the frequency domain (e.g., alpha/mu, beta, and high gamma band). The simulated signals were generated in a MATLAB SIMULINK framework and output through a National Instruments PCI card for recording by a standard research-grade ECoG amplifier system. Trial-averaged event-related spectrograms computed offline from simulated signals exhibit characteristics similar to those of experimental human ECoG recordings. The presented simulator can serve as a useful tool for testing real-time brain-machine interface (BMI) applications. It can also serve as a potential framework for future implementation of neuronal models for generation of extracellular field potentials.
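A rough Python analogue of such a simulator is sketched below: it builds one synthetic event-related ECoG trial from a noisy background, ongoing alpha/mu and beta rhythms, an event-related potential, and a post-stimulus high-gamma burst. This is only an illustrative stand-in with assumed amplitudes and timings; the paper's MATLAB SIMULINK model and National Instruments output stage are not reproduced.

```python
import numpy as np

FS = 1000                      # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / FS)  # one 2-second simulated trial, event at t = 1.0 s
rng = np.random.default_rng(2)

# 1/f-like background: cumulative sum of white noise (crude pink-ish spectrum)
background = np.cumsum(rng.standard_normal(t.size)) * 0.02

# Ongoing alpha/mu and beta idling rhythms
idling = 5.0 * np.sin(2 * np.pi * 10 * t) + 2.0 * np.sin(2 * np.pi * 20 * t)

# Event-related high-gamma burst: 80 Hz oscillation under a Gaussian envelope after the event
hg_envelope = np.exp(-((t - 1.15) ** 2) / (2 * 0.05 ** 2)) * (t > 1.0)
high_gamma = 3.0 * hg_envelope * np.sin(2 * np.pi * 80 * t)

# Event-related potential: a slow post-stimulus deflection
erp = 8.0 * np.exp(-((t - 1.3) ** 2) / (2 * 0.1 ** 2)) * (t > 1.0)

simulated_ecog = background + idling + high_gamma + erp   # microvolt-scale trace
print(simulated_ecog.shape)
```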
- Published
- 2013
- Full Text
- View/download PDF
47. HARMONIE: A multimodal control framework for human assistive robotics
- Author
-
David P. McMullen, R. Jacob Vogelstein, Kapil D. Katyal, Nathan E. Crone, Matthew S. Johannes, Brock A. Wester, Alex H. Firpi, Andrew J. Harris, Timothy G. McGee, Robert S. Armiger, Guy Hotson, and Matthew S. Fifer
- Subjects
Computer science ,business.industry ,Robot end effector ,law.invention ,law ,Control system ,User control ,Joystick ,Eye tracking ,Computer vision ,Augmented reality ,Artificial intelligence ,business ,Brain–computer interface ,Graphical user interface - Abstract
Effective user control of highly dexterous and robotic assistive devices requires intuitive and natural modalities. Although surgically implanted brain-computer interfaces (BCIs) strive to achieve this, a number of non-invasive engineering solutions may provide a quicker path to patient use by eliminating surgical implantation. We present the development of a semi-autonomous control system that utilizes computer vision, prosthesis feedback, effector-centric device control, smooth movement trajectories, and appropriate hand conformations to interact with objects of interest. Users can direct a prosthetic limb through an intuitive graphical user interface to complete multi-stage tasks using patient-appropriate combinations of control inputs such as eye tracking, conventional prosthetic controls/joysticks, surface electromyography (sEMG) signals, and neural interfaces (ECoG, EEG). Aligned with activities of daily living (ADL), these tasks include directing the prosthetic to specific locations or objects, grasping of objects by modulating hand conformation, and action upon grasped objects such as self-feeding. This Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE) semi-autonomous control system lowers the user's cognitive load, leaving the bulk of command and control of the device to the computer. This flexible and intuitive control system could serve patient populations ranging from wheelchair-bound quadriplegics to upper-limb amputees.
- Published
- 2013
- Full Text
- View/download PDF
48. Listening to the music of the brain: Live analysis of ECoG recordings using digital audio workstation software
- Author
-
Matthew S. Fifer, Nitish V. Thakor, Nathan E. Crone, and Griffin Milsap
- Subjects
Signal processing ,Computer science ,business.industry ,Speech recognition ,Process (computing) ,Drag and drop ,computer.software_genre ,Pure Data ,Software ,Sonification ,Computer graphics (images) ,business ,Audio signal processing ,computer ,computer.programming_language ,Brain–computer interface - Abstract
A process is presented for analyzing electrocorticographic (ECoG) recordings and prototyping brain-computer interfaces in which complex signal processing chains can be rapidly developed and iterated in digital audio workstation (DAW) software. DAW software includes many built-in “drag and drop” blocks that perform common, low-level signal processing algorithms such as filtering and envelope extraction. In addition to being optimized for real-time performance, DAW software also produces audio output, allowing for listening to raw and processed signals. Hearing these sonifications can impart new insights that may not be apparent in purely visual representations. A simple functional mapping analysis is performed in a DAW called Pure Data and compared to the results from a more traditional spatiotemporal analysis in MATLAB. Channels exhibiting qualitative activation in the resulting functional maps were further analyzed in another DAW called Renoise, wherein several high frequency (i.e., >400 Hz) features were observed. This study demonstrates an example use of DAW software, which we suggest is an easy-to-use and intuitive environment for real-time exploratory analyses and sophisticated sonification of ECoG recordings.
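The low-level operations that the DAW block chains perform (band-pass filtering, envelope extraction, and conversion to audible output) can be approximated offline with a short script. The sketch below uses SciPy on a random stand-in signal; the sampling rates, band limits, and tone-modulation scheme are assumptions for illustration, not the processing chains built in Pure Data or Renoise.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.io import wavfile

FS_ECOG = 1000      # Hz, assumed ECoG sampling rate
FS_AUDIO = 8000     # Hz, output audio rate

rng = np.random.default_rng(3)
ecog = rng.standard_normal(10 * FS_ECOG)          # stand-in for one recorded channel

# High-gamma band-pass followed by envelope extraction (what DAW filter/envelope blocks do)
b, a = butter(4, [70 / (FS_ECOG / 2), 110 / (FS_ECOG / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, ecog)))

# Use the envelope to amplitude-modulate an audible tone, then write it to a WAV file
n_audio = int(len(envelope) * FS_AUDIO / FS_ECOG)
env_audio = np.interp(np.linspace(0, len(envelope) - 1, n_audio),
                      np.arange(len(envelope)), envelope)
tone = np.sin(2 * np.pi * 440 * np.arange(n_audio) / FS_AUDIO)
audio = (env_audio / env_audio.max()) * tone
wavfile.write("hg_sonification.wav", FS_AUDIO, (audio * 32767).astype(np.int16))
```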
- Published
- 2013
- Full Text
- View/download PDF
49. Electrocorticographic decoding of ipsilateral reach in the setting of contralateral arm weakness from a cortical lesion
- Author
-
Matthew S. Fifer, Nathan E. Crone, Soumyadipta Acharya, Nitish V. Thakor, William S. Anderson, and Guy Hotson
- Subjects
Adult ,Male ,Movement ,Electroencephalography ,Brain mapping ,Cortex (anatomy) ,Motor system ,medicine ,Humans ,neoplasms ,Brain–computer interface ,Brain Mapping ,Epilepsy ,medicine.diagnostic_test ,Motor Cortex ,Neurophysiology ,Spinal cord ,Evoked Potentials, Motor ,Paresis ,medicine.anatomical_structure ,Brain-Computer Interfaces ,Arm ,Psychology ,Neuroscience ,Algorithms ,Motor cortex - Abstract
Brain-machine interfaces (BMIs) have the potential to restore motor function not only in patients with amputations or lesions of efferent pathways in the spinal cord and peripheral nerves, but also in patients with acquired brain lesions such as strokes and tumors. In these patients, the most efficient components of cortical motor systems are not available for BMI control. Here we had the opportunity to investigate whether subdural electrocorticographic (ECoG) signals could be used to control natural reaching movements under these circumstances. In a subject with left arm monoparesis following resection of a recurrent glioma, we found that ECoG signals recorded from the remaining cortex were sufficient for decoding the kinematics of natural reach movements of the nonparetic arm, ipsilateral to the ECoG recordings. The relationship between the subject's ECoG signals and the reach trajectory in three dimensions, two of which were highly correlated, was captured with a computationally simple linear model (mean Pearson's r: depth = 0.68, height = 0.73, lateral = 0.24). These results were attained with only a small subset of 7 temporal/spectral neural signal features. The small number of neural features needed to attain high decoding performance shows promise for a restorative BMI controlled solely by ipsilateral ECoG signals.
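A minimal sketch of the decoding setup described above (a computationally simple linear model mapping a handful of temporal/spectral features to three-dimensional reach kinematics, evaluated with Pearson's r per dimension) is given below. The data are synthetic, and the train/test split, noise level, and dimension names are assumptions; only the overall linear-regression-plus-correlation recipe mirrors the abstract.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from scipy.stats import pearsonr

rng = np.random.default_rng(4)
n_samples, n_features = 5000, 7          # 7 temporal/spectral features, as in the abstract
X = rng.standard_normal((n_samples, n_features))
true_W = rng.standard_normal((n_features, 3))
kinematics = X @ true_W + 0.5 * rng.standard_normal((n_samples, 3))  # synthetic x/y/z reach position

# Fit a simple linear model on the first 4000 samples, test on the remainder
model = LinearRegression().fit(X[:4000], kinematics[:4000])
pred = model.predict(X[4000:])

for i, name in enumerate(["depth", "height", "lateral"]):
    r, _ = pearsonr(pred[:, i], kinematics[4000:, i])
    print(f"Pearson's r ({name}): {r:.2f}")
```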
- Published
- 2013
50. Spatial-temporal functional mapping of language at the bedside with electrocorticography: Author Response
- Author
-
Adeen Flinker, William S. Anderson, Mackenzie C. Cervenka, Dana Boatman-Reich, Anna Korzeniewska, James W. Wheless, Matthew S. Fifer, Nathan E. Crone, Abbas Babajani-Feremi, James W. Papanicolaou, and Yujing Wang
- Subjects
medicine.medical_specialty ,media_common.quotation_subject ,Electrocortical Stimulation Mapping ,Dopamine agonist ,050105 experimental psychology ,03 medical and health sciences ,0302 clinical medicine ,Physical medicine and rehabilitation ,mental disorders ,Medicine ,0501 psychology and cognitive sciences ,Restless legs syndrome ,Electrocorticography ,media_common ,medicine.diagnostic_test ,business.industry ,Addiction ,05 social sciences ,Forme fruste ,medicine.disease ,body regions ,Functional mapping ,Neurology (clinical) ,Withdrawal syndrome ,business ,030217 neurology & neurosurgery ,medicine.drug - Abstract
Editors' Note: In WriteClick this week, Dr. Nirenberg responds to “Augmentation and impulsive behaviors in restless legs syndrome: Coexistence or association?” with the hypothesis that augmentation in restless legs syndrome may be a forme fruste of dopamine agonist withdrawal syndrome. Authors Djamshidian et al. report no association between augmentation and medication nonadherence, but agree with monitoring these patients closely for addictive behaviors. In reference to “Spatial-temporal functional mapping of language at the bedside with electrocorticography,” Drs. Babajani-Feremi et al. and authors Wang et al. discuss the use of high-gamma electrocorticography, electrocortical stimulation mapping, …
- Published
- 2016
- Full Text
- View/download PDF