FlexSleepTransformer: a transformer-based sleep staging model with flexible input channel configurations.
- Author: Guo, Yanchen; Nowakowski, Maciej; Dai, Weiying
- Subjects: Artificial neural networks; Sleep stages; Deep learning; Transformer models; Sleep
- Abstract
Clinical sleep diagnosis traditionally relies on polysomnography (PSG) and expert manual classification of sleep stages. Recent advances in deep learning have shown promise in automating sleep stage classification using a single PSG channel. However, variations in PSG acquisition devices and environments mean that the number of PSG channels can differ across sleep centers. To integrate a sleep staging method into clinical practice effectively, it must accommodate a flexible number of PSG channels. In this paper, we propose FlexSleepTransformer, a transformer-based model designed to handle a varying number of input channels, making it adaptable to diverse sleep staging datasets. We evaluated FlexSleepTransformer on two distinct datasets: the public SleepEDF-78 dataset and the local SleepUHS dataset. Notably, FlexSleepTransformer is the first model capable of training simultaneously on datasets with differing numbers of PSG channels. Our experiments showed that FlexSleepTransformer trained on both datasets together achieved 98% of the accuracy of models trained on each dataset individually. Furthermore, it outperformed models trained exclusively on one dataset when tested on the other. FlexSleepTransformer also surpassed state-of-the-art CNN- and RNN-based models on both datasets. Because it adapts to varying channel counts, FlexSleepTransformer holds significant potential for clinical adoption, especially when trained with data from a wide range of sleep centers.
- Published: 2024
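The abstract describes a transformer-based model that accepts a flexible number of PSG input channels. The paper's actual architecture is not reproduced here; the following is a minimal sketch, assuming each channel's epoch is embedded as one token so that self-attention and mean-pooling operate over however many channel tokens a given dataset provides. The class name FlexChannelEncoder and all layer sizes are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the authors' code): channels as tokens, so the same
# encoder handles recordings with different numbers of PSG channels.
import torch
import torch.nn as nn


class FlexChannelEncoder(nn.Module):
    def __init__(self, epoch_len=3000, d_model=128, n_heads=4, n_layers=2, n_stages=5):
        super().__init__()
        # Per-channel embedding: one epoch of epoch_len samples -> a d_model token
        self.channel_embed = nn.Linear(epoch_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, n_stages)

    def forward(self, x):
        # x: (batch, n_channels, epoch_len); n_channels may vary between datasets
        tokens = self.channel_embed(x)      # (batch, n_channels, d_model)
        tokens = self.encoder(tokens)       # self-attention across channel tokens
        pooled = tokens.mean(dim=1)         # pooling is agnostic to channel count
        return self.classifier(pooled)      # sleep-stage logits


model = FlexChannelEncoder()
print(model(torch.randn(8, 2, 3000)).shape)  # 2-channel montage -> (8, 5)
print(model(torch.randn(8, 6, 3000)).shape)  # 6-channel montage -> (8, 5)
```

Because the channel dimension is consumed only by attention and pooling, the same weights can, in principle, be trained on batches drawn from datasets with different channel counts, which is the property the abstract highlights.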