IEEE Access (Jan 2023)

4s-SleepGCN: Four-Stream Graph Convolutional Networks for Sleep Stage Classification

  • Menglei Li,
  • Hongbo Chen,
  • Yong Liu,
  • Qiangfu Zhao

DOI: https://doi.org/10.1109/ACCESS.2023.3294410
Journal volume & issue: Vol. 11, pp. 70621–70634

Abstract


Sleep staging is a critical basis for assessing sleep quality and diagnosing sleep disorders in clinical practice. Most existing methods rely on a single channel for sleep staging and therefore neglect the complementary characteristics of multimodal electrophysiological signals. In contrast, current multi-stream sleep staging networks primarily take electrooculogram (EOG) and electroencephalogram (EEG) signals as inputs and efficiently fuse the extracted multimodal features. However, the motion information contained in electrophysiological signals is rarely investigated, even though it could improve classification performance. Moreover, recent sleep staging models suffer from overparameterization and suboptimal classification accuracy. Furthermore, EOG and EEG are non-Euclidean, graph-structured data that can be handled effectively by graph convolutional networks. To address these issues, we propose an efficient graph-based multi-stream model, 4s-SleepGCN, which combines biological signal features to classify sleep stages. In each single-stream model, the positional relationship of the modal sequences is incorporated to enhance the feature representation for sleep staging. On this basis, graph convolutions capture spatial features, while multi-scale temporal convolutions model temporal dynamics and extract more discriminative contextual temporal features. The EEG signal, the EOG signal, and their corresponding motion information are fed separately into the single-stream models that make up our 4s-SleepGCN. Experimental results show that 4s-SleepGCN achieves the highest accuracy among state-of-the-art methods on both the Sleep-EDF-39 dataset (92.3%) and the Sleep-EDF-153 dataset (85.5%). In addition, extensive experiments on these two representative datasets demonstrate the effectiveness of the motion modalities for sleep stage classification, and the proposed single-stream network achieves higher classification accuracy (89.2% and 89.8%) while requiring 33% fewer parameters. Our 4s-SleepGCN model can serve as a powerful tool to assist sleep experts in assessing sleep quality and diagnosing sleep-related diseases.
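
To make the described architecture more concrete, the following PyTorch sketch illustrates the general idea of a four-stream design: one graph-plus-temporal-convolution stream per input (EEG, EOG, and their "motion" counterparts), with late fusion of the per-stream class scores. This is not the authors' implementation; the class names (GraphTemporalStream, FourStreamSleepGCN), the first-order temporal-difference definition of the motion streams, the learnable adjacency, the kernel sizes, and the score-averaging fusion are all illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of a four-stream GCN-style
# sleep-staging model. Assumptions: epochs shaped (batch, channels, time),
# "motion" streams approximated as first-order temporal differences, a
# learnable adjacency over channels, and late fusion by score averaging.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphTemporalStream(nn.Module):
    """One stream: graph convolution over channels + multi-scale temporal convolutions."""

    def __init__(self, num_channels: int, feat_dim: int = 64, num_classes: int = 5):
        super().__init__()
        # Learnable adjacency capturing spatial relations between channels (assumed form).
        self.adj = nn.Parameter(torch.eye(num_channels) + 0.01 * torch.randn(num_channels, num_channels))
        self.embed = nn.Conv1d(1, feat_dim, kernel_size=1)  # lift raw samples to features
        # Multi-scale temporal convolutions: different kernel sizes = different time scales.
        self.temporal = nn.ModuleList([
            nn.Conv1d(feat_dim, feat_dim, kernel_size=k, padding=k // 2)
            for k in (3, 7, 15)
        ])
        self.classifier = nn.Linear(feat_dim * len(self.temporal), num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time)
        b, c, t = x.shape
        h = self.embed(x.reshape(b * c, 1, t)).reshape(b, c, -1, t)   # (b, c, feat, t)
        # Graph convolution: mix information across channels via the adjacency.
        a = torch.softmax(self.adj, dim=-1)
        h = torch.einsum("ij,bjft->bift", a, h)
        h = h.mean(dim=1)                                             # pool channels -> (b, feat, t)
        # Multi-scale temporal modelling, then global average pooling per scale.
        feats = [F.relu(conv(h)).mean(dim=-1) for conv in self.temporal]
        return self.classifier(torch.cat(feats, dim=-1))              # (b, num_classes)


class FourStreamSleepGCN(nn.Module):
    """EEG, EOG, and their temporal-difference ('motion') streams, fused by score averaging."""

    def __init__(self, eeg_channels: int = 2, eog_channels: int = 1, num_classes: int = 5):
        super().__init__()
        self.streams = nn.ModuleList([
            GraphTemporalStream(eeg_channels, num_classes=num_classes),  # EEG stream
            GraphTemporalStream(eog_channels, num_classes=num_classes),  # EOG stream
            GraphTemporalStream(eeg_channels, num_classes=num_classes),  # EEG motion stream
            GraphTemporalStream(eog_channels, num_classes=num_classes),  # EOG motion stream
        ])

    def forward(self, eeg: torch.Tensor, eog: torch.Tensor) -> torch.Tensor:
        # Motion information approximated here as first-order differences along time.
        eeg_motion = eeg[..., 1:] - eeg[..., :-1]
        eog_motion = eog[..., 1:] - eog[..., :-1]
        inputs = (eeg, eog, eeg_motion, eog_motion)
        scores = [stream(x) for stream, x in zip(self.streams, inputs)]
        return torch.stack(scores).mean(dim=0)  # late fusion of the four streams


if __name__ == "__main__":
    model = FourStreamSleepGCN()
    eeg = torch.randn(4, 2, 3000)   # e.g. 30-second epochs at 100 Hz
    eog = torch.randn(4, 1, 3000)
    print(model(eeg, eog).shape)    # torch.Size([4, 5])
```

The score-averaging fusion shown above is only one plausible way to combine four streams; weighted fusion or feature-level concatenation would fit the same multi-stream idea.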

Keywords