
Author:

Zuo, Guo-Yu (Zuo, Guo-Yu.) (Scholars:左国玉) | Xu, Zhao-Kun (Xu, Zhao-Kun.) | Lu, Jia-Hao (Lu, Jia-Hao.) | Gong, Dao-Xiong (Gong, Dao-Xiong.)

Indexed by:

EI; Scopus; CSCD

Abstract:

An SODDAG-SVM (structure-optimized decision directed acyclic graph-support vector machine) multi-classification action recognition method for the upper limb rehabilitation training of Brunnstrom stage 4~5 patients is proposed to solve the core action recognition problem in rehabilitation training evaluation. First, the multi-classification problem is decomposed into a set of binary classification problems, and a support vector machine (SVM) is used to construct each binary classifier; the SVM kernel parameters and the feature subset of each binary classifier are optimized by a genetic algorithm and a feature-subset discrimination criterion, respectively. Then, the generalization error of each SVM binary classifier is used to measure the separability of its class pair, and an upper triangular matrix of generalization errors is built. Finally, starting from the root node, an SODDAG-SVM structure is constructed by choosing, at each node, the SVM classifier of the most easily separated class pair according to that node's generalization error matrix. When few instances need to be predicted, only the part of the SODDAG-SVM structure traversed by those instances is built; when many instances need to be predicted, a complete SODDAG-SVM structure is constructed first and then used to predict all the instances. An action recognition experiment is performed on upper limb routine rehabilitation training samples of Brunnstrom stage 4~5 patients, acquired using human body sensing technology. The results show that the accuracy reaches 95.49%, higher than those of the conventional decision directed acyclic graph (DDAG) and MaxWins methods, which proves that the proposed method can effectively improve the accuracy of rehabilitation training action recognition. Copyright © 2020 Acta Automatica Sinica. All rights reserved.
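
The construction described in the abstract can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes scikit-learn's SVC with fixed RBF parameters in place of the paper's GA-optimized kernel parameters and feature subsets, and it uses 5-fold cross-validation error as the generalization-error estimate for each class pair. The prediction routine follows the abstract's few-instances mode, building only the DDAG path an instance actually traverses by repeatedly testing the most easily separated (lowest-error) pair among the classes still in play.

```python
# Sketch of generalization-error-driven DDAG-SVM construction.
# Assumptions (not from the paper): scikit-learn SVC, fixed RBF
# parameters instead of GA-tuned kernels/feature subsets, and
# 5-fold CV error as the generalization-error estimate.
from itertools import combinations

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC


def pairwise_error_matrix(X, y, classes):
    """Estimate a generalization error for every class pair (i < j)
    and fit the corresponding binary SVM on that pair's samples."""
    errors, models = {}, {}
    for i, j in combinations(range(len(classes)), 2):
        mask = np.isin(y, [classes[i], classes[j]])
        Xij, yij = X[mask], y[mask]
        clf = SVC(kernel="rbf", C=1.0, gamma="scale")
        # CV error stands in for the paper's per-pair optimization.
        errors[(i, j)] = 1.0 - cross_val_score(clf, Xij, yij, cv=5).mean()
        models[(i, j)] = clf.fit(Xij, yij)
    return errors, models


def ddag_predict(x, classes, errors, models):
    """Walk the structure-optimized DDAG for one instance: at each
    node, test the lowest-error (most separable) pair among the
    classes still alive and eliminate the losing class."""
    alive = list(range(len(classes)))  # stays sorted, so pairs are (i < j)
    while len(alive) > 1:
        i, j = min(combinations(alive, 2), key=lambda p: errors[p])
        pred = models[(i, j)].predict(x.reshape(1, -1))[0]
        alive.remove(j if pred == classes[i] else i)
    return classes[alive[0]]
```

Calling `ddag_predict` per test sample corresponds to the abstract's partial-structure mode; for a large batch, the chosen node sequence could be computed once and cached, which corresponds to building the complete SODDAG-SVM structure up front.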

Keyword:

Classification (of information); Genetic algorithms; Directed graphs; Neuromuscular rehabilitation; Errors; Support vector machines

Author Community:

  • [ 1 ] [Zuo, Guo-Yu] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 2 ] [Zuo, Guo-Yu] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China
  • [ 3 ] [Xu, Zhao-Kun] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 4 ] [Xu, Zhao-Kun] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China
  • [ 5 ] [Lu, Jia-Hao] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 6 ] [Lu, Jia-Hao] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China
  • [ 7 ] [Gong, Dao-Xiong] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 8 ] [Gong, Dao-Xiong] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China

Reprint Author's Address:

  • 左国玉 (Zuo, Guo-Yu)

    [Zuo, Guo-Yu] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China; [Zuo, Guo-Yu] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing 100124, China

Source:

Acta Automatica Sinica

ISSN: 0254-4156

Year: 2020

Issue: 3

Volume: 46

Page: 549-561

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 5

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 17
