1. E2FNet: An EEG- and EMG-Based Fusion Network for Hand Motion Intention Recognition
- Author
Jiang, Guoqian; Wang, Kunyu; He, Qun; Xie, Ping
- Abstract
In light of the growing population of individuals with limb disorders, there is an increasing need to address the challenges they face in their daily lives. Existing rehabilitation technologies often rely on single physiological signals and suffer from poor signal quality, which limits their effectiveness. To overcome these constraints, we present E2FNet, a multimodal physiological information fusion network designed for motor intention recognition in individuals with limb disorders. In this study, electromyography (EMG) and electroencephalography (EEG) signals were recorded from eight healthy participants during various hand movements. E2FNet uses a multiscale convolutional neural network to extract features from the EEG and EMG data, with an emphasis on information fusion across different scales. We also introduce a cross-attention mechanism to capture cross-modal interactions and strengthen the fusion of EEG and EMG information. In extensive experiments, E2FNet achieved a classification accuracy of 92.08%, and the effectiveness of each module was verified. Multiscale separable convolution and cross-attention significantly improved EEG and EMG signal fusion, enhancing the accuracy and robustness of motion intention recognition. This research promises to improve the quality of life and independence of individuals with movement disorders, while also advancing rehabilitation robotics and assistive technology.
- Published
- 2024
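
The abstract names two architectural ingredients: per-modality multiscale (separable) convolutional feature extraction and a cross-attention mechanism that fuses the EEG and EMG streams. The paper's implementation is not reproduced here; the PyTorch sketch below is only a rough illustration of how those two ideas can be wired together, and every concrete choice (channel counts, kernel sizes, number of classes, the pooling and classification head) is an assumption rather than a detail taken from the paper.

```python
# Hypothetical sketch of an E2FNet-style EEG/EMG fusion model (not the authors' code).
# Assumed input shapes: EEG (batch, eeg_channels, time) and EMG (batch, emg_channels, time).
import torch
import torch.nn as nn


class MultiScaleSeparableConv(nn.Module):
    """Depthwise-separable 1D convolutions at several kernel sizes, concatenated."""

    def __init__(self, in_channels: int, out_channels: int, kernel_sizes=(7, 15, 31)):
        super().__init__()
        branch_out = out_channels // len(kernel_sizes)
        self.branches = nn.ModuleList()
        for k in kernel_sizes:
            self.branches.append(nn.Sequential(
                # depthwise conv captures temporal patterns within each channel
                nn.Conv1d(in_channels, in_channels, k, padding=k // 2, groups=in_channels),
                # pointwise conv mixes information across channels
                nn.Conv1d(in_channels, branch_out, 1),
                nn.BatchNorm1d(branch_out),
                nn.ELU(),
            ))

    def forward(self, x):
        # concatenate the multiscale branches along the feature dimension
        return torch.cat([b(x) for b in self.branches], dim=1)


class CrossAttentionFusion(nn.Module):
    """EEG features attend to EMG features and vice versa; the results are concatenated."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.eeg_to_emg = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.emg_to_eeg = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, eeg_feat, emg_feat):
        # inputs: (batch, time, dim)
        eeg_attended, _ = self.eeg_to_emg(eeg_feat, emg_feat, emg_feat)
        emg_attended, _ = self.emg_to_eeg(emg_feat, eeg_feat, eeg_feat)
        return torch.cat([eeg_attended, emg_attended], dim=-1)


class FusionClassifier(nn.Module):
    """Assumed end-to-end wiring: per-modality branches -> cross-attention -> linear head."""

    def __init__(self, eeg_channels=32, emg_channels=8, feat_dim=48, num_classes=4):
        super().__init__()
        self.eeg_branch = MultiScaleSeparableConv(eeg_channels, feat_dim)
        self.emg_branch = MultiScaleSeparableConv(emg_channels, feat_dim)
        self.fusion = CrossAttentionFusion(feat_dim)
        self.head = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, eeg, emg):
        eeg_feat = self.eeg_branch(eeg).transpose(1, 2)  # (batch, time, feat_dim)
        emg_feat = self.emg_branch(emg).transpose(1, 2)
        fused = self.fusion(eeg_feat, emg_feat)
        return self.head(fused.mean(dim=1))  # average over time, then classify


if __name__ == "__main__":
    model = FusionClassifier()
    eeg = torch.randn(2, 32, 256)   # 2 trials, 32 EEG channels, 256 samples (assumed)
    emg = torch.randn(2, 8, 256)    # 2 trials, 8 EMG channels, 256 samples (assumed)
    print(model(eeg, emg).shape)    # torch.Size([2, 4])
```

The two attention calls let each modality query the other, which is one common way to realize the cross-modal interaction the abstract refers to; the actual fusion strategy, hyperparameters, and head design in E2FNet may differ.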