
Author:

Duan, Lijuan | Ma, Bian | Yin, Yue | Huang, Zhaoyang | Qiao, Yuanhua

Indexed by:

EI, Scopus, SCIE

Abstract:

Accurate automatic sleep staging is crucial for diagnosing sleep disorders. However, most existing data-driven sleep staging methods cannot fully learn complex sleep staging criteria, such as the American Academy of Sleep Medicine (AASM) rules, from limited labeled data. This paper proposes MMS-SleepNet, a novel multimodal, multiscale automatic sleep staging framework that explicitly incorporates AASM knowledge. It employs a deep multimodal feature extraction module (MMS-FE) that embeds expert knowledge to capture the multimodal features characteristic of each stage, together with fine-grained EEG features at multiple frequencies. The module uses an attention mechanism to fuse the extracted multimodal features, significantly improving classification accuracy. To further improve performance, a contrastive learning module and a data balancing strategy are introduced to address the class confusion and data imbalance found in existing models. Specifically, the contrastive classification module (CCM) emphasizes intra-class similarity and inter-class disparity, alleviating class confusion. A simple yet effective data balancing mechanism augments the number of N1-stage samples, so that the model is trained on a more balanced dataset and the long-tail distribution caused by class imbalance is mitigated. Experimental results on two public datasets validate the effectiveness of MMS-SleepNet: it achieves 92.9% accuracy on the Sleep-EDF-20 dataset, surpassing other methods, and notably attains 74.1% accuracy on the challenging N1 stage, outperforming other methods by 19.6–49.2%. © 2024 Elsevier Ltd
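The attention-based fusion of per-modality features described in the abstract can be sketched in miniature as follows. This is a minimal illustration, not the authors' MMS-FE implementation: the `attention_fuse` function, the learned query vector, and the toy feature vectors are all assumptions introduced here for clarity.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention_fuse(features, query):
    """Fuse per-modality feature vectors (e.g. EEG/EOG/EMG) into one vector.

    features: list of equal-length vectors, one per modality
    query:    a (hypothetically learned) vector that scores each modality
    Each modality gets a scalar score (dot product with the query); the
    softmax of the scores weights a sum over the modality vectors.
    """
    scores = [sum(q * f for q, f in zip(query, feat)) for feat in features]
    weights = softmax(scores)
    dim = len(features[0])
    return [sum(w * feat[i] for w, feat in zip(weights, features))
            for i in range(dim)]

# Toy usage: two modalities with 2-D features; the query favors the first axis.
fused = attention_fuse([[1.0, 0.0], [0.0, 1.0]], [1.0, 0.0])
```

In a real network the query (or a small scoring sub-network) would be trained jointly with the feature extractors, so modalities that are more informative for the current sleep stage receive higher fusion weights.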

Keyword:

Contrastive learning; Sleep research

Author Community:

  • [ 1 ] [Duan, Lijuan] College of Computer Science, Beijing University of Technology, Beijing 100124, China
  • [ 2 ] [Duan, Lijuan] Beijing Key Laboratory of Trusted Computing, Beijing 100124, China
  • [ 3 ] [Duan, Lijuan] China National Engineering Laboratory for Critical Technologies of Information Security Classified Protection, Beijing 100124, China
  • [ 4 ] [Ma, Bian] College of Computer Science, Beijing University of Technology, Beijing 100124, China
  • [ 5 ] [Ma, Bian] Beijing Key Laboratory of Trusted Computing, Beijing 100124, China
  • [ 6 ] [Ma, Bian] China National Engineering Laboratory for Critical Technologies of Information Security Classified Protection, Beijing 100124, China
  • [ 7 ] [Yin, Yue] College of Computer Science, Beijing University of Technology, Beijing 100124, China
  • [ 8 ] [Yin, Yue] Beijing Key Laboratory of Trusted Computing, Beijing 100124, China
  • [ 9 ] [Yin, Yue] China National Engineering Laboratory for Critical Technologies of Information Security Classified Protection, Beijing 100124, China
  • [ 10 ] [Huang, Zhaoyang] Department of Neurology, Xuanwu Hospital, Capital Medical University, Beijing 100053, China
  • [ 11 ] [Huang, Zhaoyang] Beijing Key Laboratory of Neuromodulation, Beijing 100053, China
  • [ 12 ] [Qiao, Yuanhua] School of Mathematics, Statistics and Mechanics, Beijing University of Technology, Beijing 100124, China

Source:

Biomedical Signal Processing and Control

ISSN: 1746-8094

Year: 2025

Volume: 103

Impact Factor: 5.100 (JCR@2022)
