CTCNet: A CNN Transformer capsule network for sleep stage classification.
- Source :
- Measurement (02632241), Feb 2024, Vol. 226
- Publication Year :
- 2024
Abstract
- In this paper, we propose a novel neural network architecture called CTCNet. First, we adopt a multi-scale convolutional neural network (MSCNN) to extract low- and high-frequency features, adaptive channel feature recalibration (ACFR) to increase the model's sensitivity to important channels in the feature maps while reducing its dependence on irrelevant or redundant ones, and a multi-scale dilated convolutional block (MSDCB) to capture different types of characteristics across feature channels. Second, we use a Transformer to extract global temporal context features. Third, we employ a capsule network to capture spatial location relationships among EEG features and refine them; the capsule network module also serves as the model's classifier. Notably, our model addresses a limitation of previous research, which failed to extract local features and global temporal context of EEG signals simultaneously and ignored the spatial location relationships between these features. Finally, we evaluate our model on three datasets, where it achieves performance better than or comparable to most state-of-the-art methods.
- • The proposed method extracts the local and global features of EEG signals.
- • We design a multi-scale CNN to extract different frequency features of EEG signals.
- • We use a Transformer to capture global temporal context information of EEG signals.
- • We use a capsule network to extract spatial relationships between EEG features.
- • The accuracy on Sleep-EDF-20 is 86.2%, on Sleep-EDF-78 is 82.5%, and on SHHS is 85.7%. [ABSTRACT FROM AUTHOR]
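The abstract's "adaptive channel feature recalibration" step, as described, resembles squeeze-and-excitation-style channel gating: pool each channel to a descriptor, pass it through a bottleneck, and rescale channels by sigmoid gates. The paper's exact layer is not specified in this abstract, so the following is only a minimal NumPy sketch of that general idea, with hypothetical weight shapes and a reduction ratio `r`:

```python
import numpy as np

def channel_recalibration(x, w1, w2):
    """Squeeze-and-excitation-style channel reweighting (illustrative only).

    x:  (channels, time) feature map
    w1: (channels//r, channels) squeeze (bottleneck) weights
    w2: (channels, channels//r) excitation weights
    """
    # Squeeze: global average pooling over time -> one descriptor per channel
    z = x.mean(axis=1)                       # (channels,)
    # Excitation: bottleneck projection with ReLU, then sigmoid gating
    s = np.maximum(w1 @ z, 0.0)              # (channels//r,)
    g = 1.0 / (1.0 + np.exp(-(w2 @ s)))      # (channels,) gates in (0, 1)
    # Recalibrate: scale each channel by its gate, suppressing
    # less informative channels and emphasizing important ones
    return x * g[:, None]

rng = np.random.default_rng(0)
C, T, r = 8, 16, 2                           # hypothetical sizes
x = rng.standard_normal((C, T))
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
y = channel_recalibration(x, w1, w2)
print(y.shape)
```

Each output channel is the input channel multiplied by a single learned gate, which is how such a block reduces dependence on redundant channels without changing the feature-map shape.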
Details
- Language :
- English
- ISSN :
- 02632241
- Volume :
- 226
- Database :
- Academic Search Index
- Journal :
- Measurement (02632241)
- Publication Type :
- Academic Journal
- Accession number :
- 175297653
- Full Text :
- https://doi.org/10.1016/j.measurement.2024.114157