
Author:

Xue, B. | Zhu, C. | Wang, X. | Zhu, W.

Indexed by:

Scopus; SCIE

Abstract:

Graph Convolutional Networks (GCNs) are widely used in text classification and have proved effective on tasks with a rich relational structure. However, because the adjacency matrix a GCN builds over a corpus is sparse, the model cannot fully exploit context-dependent information in text classification and is poor at capturing local information. Bidirectional Encoder Representations from Transformers (BERT) captures contextual information within a sentence or document, but is limited in capturing global, corpus-level information about vocabulary, which is precisely the strength of a GCN. This paper therefore proposes an improved model to address these problems. The original GCN builds its text graph from word co-occurrence relationships; these connections are not rich enough to capture context dependencies well, so we introduce a semantic dictionary and syntactic dependencies. While this enhances the model's ability to capture contextual dependencies, it still lacks the ability to model sequences. We therefore introduce BERT and a Bi-directional Long Short-Term Memory (BiLSTM) network to learn deeper features of the text, improving the classification performance of the model. Experimental results show that our model outperforms previously reported methods on four text classification datasets. © 2022 by the authors.
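The graph-convolution step at the heart of the model described above can be sketched as a single generic GCN layer. This is a minimal NumPy illustration of the standard propagation rule ReLU(D^-1/2 (A+I) D^-1/2 X W), not the authors' exact implementation; the layer sizes, activation, and example graph are illustrative assumptions.

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One GCN propagation step: ReLU(D^-1/2 (A+I) D^-1/2 X W).

    adj      -- (n, n) binary adjacency matrix of the text graph
    features -- (n, d_in) node (word/document) feature matrix
    weights  -- (d_in, d_out) trainable projection matrix
    """
    n = adj.shape[0]
    a_hat = adj + np.eye(n)                        # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt       # symmetric normalization
    return np.maximum(0.0, a_norm @ features @ weights)  # ReLU

# Tiny hypothetical text graph: 3 nodes in a path 0-1-2
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x = np.eye(3)                 # one-hot node features
w = np.full((3, 2), 0.5)      # toy weight matrix
out = gcn_layer(adj, x, w)    # (3, 2) matrix of non-negative activations
```

In the paper's setting the adjacency matrix would additionally encode the semantic-dictionary and dependency edges the abstract mentions, rather than co-occurrence alone.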

Keyword:

text classification; Bi-directional Long Short-Term Memory; ResNet; dependencies; graph convolutional network

Author Community:

  • [ 1 ] [Xue, B.] Faculty of Information Technology, Beijing University of Technology, Beijing, 100020, China
  • [ 2 ] [Zhu, C.] Faculty of Information Technology, Beijing University of Technology, Beijing, 100020, China
  • [ 3 ] [Wang, X.] Faculty of Information Technology, Beijing University of Technology, Beijing, 100020, China
  • [ 4 ] [Zhu, W.] Faculty of Information Technology, Beijing University of Technology, Beijing, 100020, China

Reprint Author's Address:

  • [Xue, B.] Faculty of Information Technology, Beijing University of Technology, Beijing, 100020, China

Source:

Applied Sciences (Switzerland)

ISSN: 2076-3417

Year: 2022

Issue: 16

Volume: 12

Impact Factor: 2.700 (JCR@2022)

ESI Discipline: ENGINEERING

ESI HC Threshold: 49

JCR Journal Grade: 2

CAS Journal Grade: 3

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 1

ESI Highly Cited Papers on the List: 0

30 Days PV: 5
