
Author:

Li, Ming-ai (Li, Ming-ai.) (Scholars:李明爱) | Dong, Yu-xin (Dong, Yu-xin.) | Sun, Yan-jun (Sun, Yan-jun.) | Yang, Jin-fu (Yang, Jin-fu.) (Scholars:杨金福) | Duan, Li-juan (Duan, Li-juan.) (Scholars:段立娟)

Indexed by:

EI; Scopus; SCIE

Abstract:

In BCI rehabilitation systems, decoding motor imagery tasks (MI-tasks) with dipoles in the source domain has gradually become a new research focus. For complex multiclass MI-tasks, many dipoles are activated, and the activation area, activation time and intensity differ across subjects, so identifying a small, subject-specific set of dipoles is very important. Two main dipole-selection methods exist: one based on physiological functional-partition theory and the other based on human experience. However, both methods select a large, redundant set of dipoles that is identical in number and position across subjects, which is not necessarily ideal for distinguishing different MI-tasks. In this paper, a data-driven method is used to preliminarily select fully activated dipoles with large amplitudes; the obtained set is then refined using the continuous wavelet transform (CWT) to best reflect the differences among the multiclass MI-tasks, yielding a subject-based dipole selection method named PRDS. PRDS is further used to decode multiclass MI-tasks: representative dipoles are found, their wavelet-coefficient power is computed and fed to one-vs.-one common spatial patterns (OVO-CSP) for feature extraction, and the resulting features are classified by a support vector machine. This decoding method, denoted D-CWTCSP, enhances the spatial resolution and makes full use of time-frequency-spatial domain information. Experiments are carried out on a public dataset with nine subjects and four classes of MI-tasks, and the proposed D-CWTCSP is compared with related methods in sensor space and brain-source space in terms of decoding accuracy, standard deviation, recall rate and kappa value.
The experimental results show that D-CWTCSP reaches an average decoding accuracy of 82.66% over the nine subjects, an 8-20% improvement over the other methods, reflecting its clear superiority in decoding accuracy. (C) 2020 Elsevier B.V. All rights reserved.
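The decoding pipeline summarized in the abstract (wavelet-filtered source signals → CSP spatial filtering → log-variance features → classifier) can be sketched as follows. This is a minimal toy on synthetic data, not the authors' implementation: it assumes a hand-rolled Morlet wavelet, a plain two-class CSP (the paper uses one-vs.-one CSP for four classes and operates on selected dipoles rather than raw channels), and a nearest-centroid classifier as a stand-in for the SVM. All names and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n_ch, n_samp, n_trials = 128, 4, 256, 30  # toy sampling rate / sizes

def morlet(fc, n_cycles=5):
    """Toy complex Morlet wavelet centred at fc Hz (Gaussian-windowed exponential)."""
    t = np.arange(-n_cycles / (2 * fc), n_cycles / (2 * fc), 1 / fs)
    sigma = n_cycles / (2 * np.pi * fc)
    return np.exp(2j * np.pi * fc * t) * np.exp(-t**2 / (2 * sigma**2))

def cwt_band(trial, fc=12.0):
    """Convolve each channel with the wavelet; the real part is a band-limited
    series whose variance stands in for the wavelet-coefficient power."""
    w = morlet(fc)
    return np.array([np.convolve(ch, w, mode="same").real for ch in trial])

def csp(trials_a, trials_b):
    """Classic two-class CSP via whitening + eigendecomposition (numpy only)."""
    cov = lambda X: np.mean([x @ x.T / x.shape[1] for x in X], axis=0)
    Ca, Cb = cov(trials_a), cov(trials_b)
    d, U = np.linalg.eigh(Ca + Cb)
    P = U @ np.diag(d ** -0.5) @ U.T      # whitening matrix for Ca + Cb
    _, V = np.linalg.eigh(P @ Ca @ P.T)   # sorts components by class-a variance
    W = V.T @ P                           # rows are spatial filters
    return np.vstack([W[:1], W[-1:]])     # keep the two most discriminative filters

def make_trials(active_ch):
    """Noise trials with a 12 Hz rhythm (toy MI signature) on one channel."""
    osc = 3 * np.sin(2 * np.pi * 12 * np.arange(n_samp) / fs)
    out = []
    for _ in range(n_trials):
        x = rng.standard_normal((n_ch, n_samp))
        x[active_ch] += osc
        out.append(x)
    return out

# two toy classes differing in which "dipole" (channel) carries the rhythm
Fa = [cwt_band(x) for x in make_trials(0)]
Fb = [cwt_band(x) for x in make_trials(1)]

W_sel = csp(Fa[:20], Fb[:20])
feat = lambda x: np.log(np.var(W_sel @ x, axis=1))  # log-variance CSP features

# nearest-centroid classifier as a stand-in for the paper's SVM
mu_a = np.mean([feat(x) for x in Fa[:20]], axis=0)
mu_b = np.mean([feat(x) for x in Fb[:20]], axis=0)
predict = lambda x: int(np.linalg.norm(feat(x) - mu_b) < np.linalg.norm(feat(x) - mu_a))

tests = [(x, 0) for x in Fa[20:]] + [(x, 1) for x in Fb[20:]]
acc = np.mean([predict(x) == y for x, y in tests])
print(f"toy held-out accuracy: {acc:.2f}")
```

The design mirrors the abstract's rationale: wavelet filtering restricts the signal to the MI-relevant band before CSP, so the spatial filters are learned from time-frequency-localised power rather than broadband covariance.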

Keyword:

Dipole selection; EEG source imaging; MI-tasks decoding; Common spatial patterns; Continuous wavelet transform

Author Community:

  • [ 1 ] [Li, Ming-ai]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Dong, Yu-xin]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Sun, Yan-jun]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 4 ] [Yang, Jin-fu]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 5 ] [Duan, Li-juan]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 6 ] [Li, Ming-ai]Beijing Key Lab Computat Intelligence & Intellige, Beijing, Peoples R China
  • [ 7 ] [Yang, Jin-fu]Beijing Key Lab Computat Intelligence & Intellige, Beijing, Peoples R China

Reprint Author's Address:

  • Li, Ming-ai (李明爱)

    [Li, Ming-ai]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China

Source:

NEUROCOMPUTING

ISSN: 0925-2312

Year: 2020

Volume: 402

Page: 195-208

Impact Factor: 6.000 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE;

ESI HC Threshold:132

Cited Count:

WoS CC Cited Count: 8

SCOPUS Cited Count: 9

ESI Highly Cited Papers on the List: 0

