Author:

Xue, Bingxin | Zhu, Cui | Wang, Xuan | Zhu, Wenjun

Indexed by:

EI; Scopus

Abstract:

Recently, Graph Convolutional Neural Networks (GCNs) have been widely used in text classification and have proven effective for tasks with rich relational structure. However, because the adjacency matrix a GCN constructs is sparse, the model cannot make full use of context-dependent information in text classification and is poor at capturing local information. The Bidirectional Encoder Representations from Transformers (BERT) model captures contextual information within sentences or documents, but it is limited in capturing global information about the vocabulary of a language, which is precisely the strength of GCNs. This paper therefore proposes an improved model, Improved Mutual Graph Convolution Networks (IMGCN), to address these problems. The original GCN builds text graphs from word co-occurrence relationships; such connections are not rich enough to capture context dependencies well, so we introduce a semantic dictionary (WordNet) and syntactic dependencies. While this enhances the model's ability to capture contextual dependencies, it still lacks the ability to model sequences. We therefore introduce BERT and a Bi-directional Long Short-Term Memory (BiLSTM) network to learn deeper textual features and improve the model's classification performance. Experimental results on four text classification datasets show that our model outperforms previously reported methods. © 2022 ACM.
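
The abstract describes the architecture only at a high level. Below is a minimal, hypothetical PyTorch sketch of the fusion idea it outlines: a GCN branch over a text graph combined with a BiLSTM branch over contextual token embeddings, with the two representations concatenated for classification. All class and parameter names here are illustrative choices, and the random adjacency matrix and token embeddings are stand-ins for the paper's WordNet/dependency graph and BERT outputs; this does not reproduce the authors' actual IMGCN implementation.

    import torch
    import torch.nn as nn

    class GCNLayer(nn.Module):
        """One graph convolution: H' = ReLU(A_hat @ H @ W)."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, a_hat, h):
            # a_hat: normalized adjacency (n, n); h: node features (n, in_dim)
            return torch.relu(a_hat @ self.linear(h))

    class FusionSketch(nn.Module):
        """Hypothetical two-branch classifier: GCN over a text graph plus
        a BiLSTM over contextual token embeddings, fused by concatenation."""
        def __init__(self, node_dim, token_dim, hidden, num_classes):
            super().__init__()
            self.gcn1 = GCNLayer(node_dim, hidden)
            self.gcn2 = GCNLayer(hidden, hidden)
            self.bilstm = nn.LSTM(token_dim, hidden // 2,
                                  batch_first=True, bidirectional=True)
            self.classifier = nn.Linear(2 * hidden, num_classes)

        def forward(self, a_hat, node_feats, doc_idx, token_embs):
            # Graph branch: two GCN layers over the document+word graph,
            # then keep only the rows belonging to document nodes.
            g = self.gcn2(a_hat, self.gcn1(a_hat, node_feats))[doc_idx]  # (b, hidden)
            # Sequence branch: BiLSTM over token embeddings (stand-ins
            # for BERT output here); mean-pool over the time dimension.
            s, _ = self.bilstm(token_embs)                               # (b, t, hidden)
            s = s.mean(dim=1)                                            # (b, hidden)
            return self.classifier(torch.cat([g, s], dim=-1))            # (b, classes)

    # Toy usage with random stand-in data (50 graph nodes, batch of 4 docs).
    n, b, t = 50, 4, 16
    a_hat = torch.softmax(torch.rand(n, n), dim=-1)  # fake normalized adjacency
    model = FusionSketch(node_dim=300, token_dim=768, hidden=128, num_classes=4)
    logits = model(a_hat, torch.rand(n, 300), torch.arange(b), torch.rand(b, t, 768))
    print(logits.shape)  # torch.Size([4, 4])

In the paper's actual setting, the adjacency matrix would be built from word co-occurrence plus WordNet and dependency edges, and genuine BERT embeddings would feed the BiLSTM; the random tensors above only make the sketch self-contained.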

Keyword:

Classification (of information); Text processing; Long short-term memory; Brain; Convolution; Semantics; Convolutional neural networks; Graph neural networks

Author Community:

  • [ 1 ] [Xue, Bingxin]Faculty of Information Technology, Beijing University of Technology, Chaoyang District, Beijing, China
  • [ 2 ] [Zhu, Cui]Faculty of Information Technology, Beijing University of Technology, Chaoyang District, Beijing, China
  • [ 3 ] [Wang, Xuan]Faculty of Information Technology, Beijing University of Technology, Chaoyang District, Beijing, China
  • [ 4 ] [Zhu, Wenjun]Faculty of Information Technology, Beijing University of Technology, Chaoyang District, Beijing, China


Source:

Year: 2022

Page: 323-331

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 9

ESI Highly Cited Papers on the List: 0
