Abstract:
Sleep is a vital physiological process for maintaining health. Sleep stage classification based on polysomnography (PSG) provides the fundamental evidence for diagnosing sleep disorders and assessing sleep quality. Manual sleep staging is time-consuming and inefficient when handling large-scale PSG data, so automatic sleep staging methods that use deep learning models to effectively learn PSG representations show extensive research prospects. Most existing models cannot fully capture epoch-level waveform information, channel-wise correlations, and sequence-level sleep transitions. This paper proposes a transformer-based hierarchical sleep staging model (HierFormer), which employs transformer encoders to extract epoch-level waveform features, channel-wise correlation features, and sequence-level transition features. Meanwhile, it adopts an attention mechanism to improve the model's interpretability of signal properties from three views: epoch-level, channel-wise, and sequence-level. Experimental results on the sleep-European data format (sleep-EDF) expanded dataset show that the proposed model achieves better sleep staging performance with fewer parameters than various baseline models. The overall accuracy, macro-averaging precision, macro-averaging recall, macro-averaging F1-score, and Cohen's kappa coefficient reach 0.807, 0.784, 0.735, 0.750, and 0.721, respectively. Performance comparisons of different feature encoding methods across the three views and visualization of the attention weights further demonstrate the satisfactory encoding ability and interpretability of the proposed model. This study aims to provide innovative deep learning approaches and technologies for sleep staging research, thus helping sleep experts improve the efficiency of sleep disorder diagnosis and treatment. © 2025 Chinese Institute of Electronics. All rights reserved.
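To make the three-view architecture described in the abstract concrete, the following is a minimal PyTorch sketch of a hierarchical transformer with epoch-level, channel-wise, and sequence-level encoders. This is not the authors' published implementation: the class name HierFormerSketch, all dimensions, layer counts, the patch size, and the mean-pooling fusion between stages are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HierFormerSketch(nn.Module):
    """Illustrative three-view hierarchical transformer for sleep staging.

    Assumptions (not from the paper): 30 s epochs of raw PSG split into
    fixed-length patches, mean pooling between stages, and the shown
    d_model / layer counts.
    """
    def __init__(self, n_channels=2, epoch_len=3000, patch=100,
                 d_model=64, n_heads=4, n_classes=5):
        super().__init__()
        self.patch = patch
        # Epoch-level view: embed raw-signal patches, then self-attend
        # over the patches within a single epoch.
        self.patch_embed = nn.Linear(patch, d_model)
        epoch_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.epoch_encoder = nn.TransformerEncoder(epoch_layer, num_layers=2)
        # Channel-wise view: self-attention across per-channel embeddings
        # to model channel correlations.
        chan_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.channel_encoder = nn.TransformerEncoder(chan_layer, num_layers=1)
        # Sequence-level view: self-attention over consecutive epochs to
        # model sleep-stage transitions.
        seq_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.seq_encoder = nn.TransformerEncoder(seq_layer, num_layers=2)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, seq_len, n_channels, epoch_len) raw PSG signals
        b, s, c, t = x.shape
        patches = x.reshape(b * s * c, t // self.patch, self.patch)
        h = self.epoch_encoder(self.patch_embed(patches))  # intra-epoch waveform features
        h = h.mean(dim=1).reshape(b * s, c, -1)            # one embedding per channel
        h = self.channel_encoder(h).mean(dim=1)            # fuse channels
        h = self.seq_encoder(h.reshape(b, s, -1))          # inter-epoch transitions
        return self.classifier(h)                          # (batch, seq_len, n_classes)

# Usage: 4 sequences of 10 epochs, 2 channels, 3000 samples per epoch.
model = HierFormerSketch()
logits = model(torch.randn(4, 10, 2, 3000))  # -> (4, 10, 5) stage logits
```

The per-layer self-attention weights of each encoder can be inspected directly, which is one plausible route to the three-view interpretability analysis the abstract mentions.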
Source:
Acta Electronica Sinica
ISSN: 0372-2112
Year: 2025
Issue: 2
Volume: 53
Page: 545-557