1. Dual-Branch Convolution Network With Efficient Channel Attention for EEG-Based Motor Imagery Classification
- Author
Kai Zhou, Aierken Haimudula, and Wanying Tang
- Subjects
Intelligent healthcare, MI-EEG classification, channel attention, dual-branch convolutional network, multi-head self-attention, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
The brain-computer interface (BCI) is a revolutionary technique that employs wearable electroencephalography (EEG) sensors and artificial intelligence (AI) to monitor and decode brain activity. EEG-based motor imagery (MI) signals are widely used in BCI applications including intelligent healthcare, robot control, and smart homes. Yet the limited ability to decode brain signals remains a significant obstacle to the broader adoption of BCI techniques. In this study, we describe an architecture, the dual-branch attention temporal convolutional network (DB-ATCNet), for EEG-based MI classification. DB-ATCNet improves MI classification performance with relatively few parameters by combining a dual-branch convolutional network with channel attention. The model consists of two primary modules: attention dual-branch convolution (ADBC) and attention temporal fusion convolution (ATFC). The ADBC module uses a dual-branch convolutional network to extract low-level MI-EEG features and incorporates channel attention to improve spatial feature extraction. The ATFC module employs sliding windows with self-attention to extract high-level temporal features and uses feature-fusion strategies to minimize information loss. DB-ATCNet achieved subject-independent accuracies of 87.33% and 69.58% on two-class and four-class classification tasks, respectively, on the PhysioNet dataset. On the BCI Competition IV-2a dataset, it achieved accuracies of 71.34% and 87.54% under subject-independent and subject-dependent evaluations, respectively, surpassing existing methods. The code is available at https://github.com/zk-xju/DB-ATCNet.
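The abstract names two building blocks: a dual-branch convolution gated by efficient channel attention (ADBC) and a temporal module with sliding-window self-attention (ATFC). The sketch below is not the authors' implementation (that is in the linked repository); it only illustrates the first idea in PyTorch, two parallel temporal-convolution branches over raw MI-EEG whose concatenated output is reweighted by an ECA-style channel-attention gate. All filter counts and kernel lengths here are illustrative assumptions.

```python
# Minimal illustrative sketch of a dual-branch temporal convolution with
# ECA-style channel attention. Not the DB-ATCNet code; layer sizes are assumptions.
import torch
import torch.nn as nn


class ECA(nn.Module):
    """Efficient channel attention: a 1-D conv over pooled channel descriptors."""
    def __init__(self, k: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x):                       # x: (batch, channels, electrodes, time)
        w = x.mean(dim=(2, 3))                  # global average pool -> (batch, channels)
        w = self.conv(w.unsqueeze(1)).squeeze(1)
        return x * torch.sigmoid(w)[:, :, None, None]


class DualBranchBlock(nn.Module):
    """Two temporal-conv branches with different kernel lengths, fused and gated."""
    def __init__(self, f1: int = 16, short_k: int = 16, long_k: int = 64):
        super().__init__()
        self.short = nn.Sequential(
            nn.Conv2d(1, f1, (1, short_k), padding=(0, short_k // 2), bias=False),
            nn.BatchNorm2d(f1), nn.ELU())
        self.long = nn.Sequential(
            nn.Conv2d(1, f1, (1, long_k), padding=(0, long_k // 2), bias=False),
            nn.BatchNorm2d(f1), nn.ELU())
        self.attn = ECA()

    def forward(self, x):                       # x: (batch, 1, electrodes, samples)
        s, l = self.short(x), self.long(x)
        t = min(s.shape[-1], l.shape[-1])       # align time lengths after padding
        return self.attn(torch.cat([s[..., :t], l[..., :t]], dim=1))


if __name__ == "__main__":
    eeg = torch.randn(8, 1, 22, 1125)           # BCI IV-2a-like shape: 22 electrodes, 1125 samples
    print(DualBranchBlock()(eeg).shape)         # torch.Size([8, 32, 22, 1126])
```

In this sketch the short- and long-kernel branches capture fast and slow temporal dynamics, and the ECA gate reweights the fused feature maps channel-wise; the actual DB-ATCNet additionally applies sliding-window self-attention and feature fusion in its ATFC module, as described in the abstract.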
- Published
2024