Title:

FetchEEG: a hybrid approach combining feature extraction and temporal-channel joint attention for EEG-based emotion classification

Author:

Liang, Y. | Zhang, C. | An, S. | Wang, Z. | Shi, K. | Peng, T. | Ma, Y. | Xie, X. | He, J. | Zheng, K.

Indexed by:

EI Scopus SCIE

Abstract:

Objective. Electroencephalogram (EEG) analysis has long been an important tool in neural engineering, and the recognition and classification of human emotions is one of its important tasks. EEG data, obtained from electrodes placed on the scalp, are a valuable source of information for brain-activity analysis and emotion recognition. Feature-extraction methods have shown promising results, but recent trends have shifted toward end-to-end methods based on deep learning. However, these approaches often overlook channel representations, and their complex structures make the models harder to fit. Approach. To address these challenges, this paper proposes a hybrid approach named FetchEEG that combines feature extraction with temporal-channel joint attention. Leveraging the advantages of both traditional feature extraction and deep learning, FetchEEG adopts a multi-head self-attention mechanism to extract representations across different time moments and channels simultaneously. The joint representations are then concatenated and classified using fully-connected layers for emotion recognition. The performance of FetchEEG is verified by comparison experiments on a self-developed dataset and two public datasets. Main results. In both subject-dependent and subject-independent experiments, FetchEEG demonstrates better performance and stronger generalization ability than state-of-the-art methods on all datasets. Moreover, its performance is analyzed for different sliding-window sizes and overlap rates in the feature-extraction module, and the sensitivity of emotion recognition is investigated for three- and five-frequency-band scenarios. Significance. FetchEEG is a novel hybrid EEG-based emotion-classification method that combines EEG feature extraction with Transformer neural networks. It achieves state-of-the-art performance on both a self-developed dataset and multiple public datasets, with significantly higher training efficiency than end-to-end methods, demonstrating its effectiveness and feasibility. © 2024 IOP Publishing Ltd.
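
The pipeline the abstract describes can be made concrete with a short sketch: per-channel power-spectral-density band features are computed over sliding windows, multi-head self-attention is applied along the temporal axis and along the channel axis, and the concatenated joint representation is classified by fully-connected layers. The following is a minimal reading in Python (SciPy/PyTorch), not the authors' implementation: the Welch-based PSD step, the five-band table, all layer sizes, the mean pooling, and every name (extract_band_power, TemporalChannelAttention) are illustrative assumptions.

# Illustrative sketch of a FetchEEG-style pipeline as described in the abstract.
# NOT the authors' code: PSD step, band table, layer sizes, pooling, and all
# names are assumptions made for the purpose of this example.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import welch

# Classical five EEG frequency bands in Hz (the paper also studies a three-band split).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def extract_band_power(eeg, fs, win_s=1.0, overlap=0.5):
    """eeg: (channels, samples) -> features: (windows, channels, n_bands)."""
    size = int(win_s * fs)
    step = int(size * (1 - overlap))          # overlap rate controls the hop
    feats = []
    for start in range(0, eeg.shape[1] - size + 1, step):
        seg = eeg[:, start:start + size]
        f, pxx = welch(seg, fs=fs, nperseg=min(256, size))   # PSD per channel
        df = f[1] - f[0]
        bp = [pxx[:, (f >= lo) & (f < hi)].sum(axis=1) * df  # integrate PSD over band
              for lo, hi in BANDS.values()]
        feats.append(np.log(np.stack(bp, axis=1) + 1e-12))   # log band power
    return np.stack(feats)

class TemporalChannelAttention(nn.Module):
    """Multi-head self-attention over time windows and over channels, jointly."""
    def __init__(self, n_bands=5, d_model=64, n_heads=4, n_classes=3):
        super().__init__()
        self.embed = nn.Linear(n_bands, d_model)
        self.t_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.c_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Sequential(nn.Linear(2 * d_model, 64), nn.ReLU(),
                                  nn.Linear(64, n_classes))

    def forward(self, x):                     # x: (batch, windows, channels, bands)
        b, t, c, _ = x.shape
        h = self.embed(x)                     # (b, t, c, d)
        ht = h.permute(0, 2, 1, 3).reshape(b * c, t, -1)     # attend across time
        ht, _ = self.t_attn(ht, ht, ht)
        ht = ht.mean(1).reshape(b, c, -1).mean(1)            # pool to (b, d)
        hc = h.reshape(b * t, c, -1)                         # attend across channels
        hc, _ = self.c_attn(hc, hc, hc)
        hc = hc.mean(1).reshape(b, t, -1).mean(1)            # pool to (b, d)
        return self.head(torch.cat([ht, hc], -1))            # concatenate -> FC layers

# Usage: 60 s of 32-channel EEG at 128 Hz -> windowed PSD features -> class logits.
eeg = np.random.randn(32, 60 * 128)
x = torch.tensor(extract_band_power(eeg, fs=128), dtype=torch.float32).unsqueeze(0)
logits = TemporalChannelAttention()(x)        # shape: (1, n_classes)

Splitting the attention into a temporal path and a channel path mirrors the abstract's point that end-to-end models often overlook channel representations; mean pooling and concatenation are simply the most direct way to form the joint representation and are assumptions of this sketch.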

Keyword:

electroencephalography; deep learning; self-attention mechanism; emotion recognition; power spectral density

Author Community:

  • [ 1 ] [Liang Y.]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 2 ] [Zhang C.]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 3 ] [An S.]JD Health International Inc., Beijing, China
  • [ 4 ] [Wang Z.]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 5 ] [Shi K.]University of Technology Sydney, Sydney, Australia
  • [ 6 ] [Peng T.]Beihang University, Beijing, China
  • [ 7 ] [Ma Y.]Beihang University, Beijing, China
  • [ 8 ] [Xie X.]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 9 ] [He J.]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 10 ] [Zheng K.]Faculty of Information Technology, Beijing University of Technology, Beijing, China

Source:

Journal of Neural Engineering

ISSN: 1741-2560

Year: 2024

Volume: 21

Issue: 3

Impact Factor: 4.000 (JCR@2022)

ESI Highly Cited Papers on the List: 0

30 Days PV: 12
