Abstract:
Graph Convolutional Networks (GCNs) have recently been widely used in text classification and have proven effective on tasks with rich relational structure. However, because the adjacency matrix a GCN builds over text is sparse, the model cannot make full use of context-dependent information and is poor at capturing local information. The Bidirectional Encoder Representations from Transformers (BERT) model captures contextual information within sentences or documents well, but it is limited in capturing global information about a language's vocabulary, which is precisely the strength of a GCN. This paper therefore proposes an improved model, Improved Mutual Graph Convolution Networks (IMGCN), to address these problems. The original GCN builds text graphs from word co-occurrence relationships; such connections are not rich enough to capture context dependencies well, so we introduce a semantic dictionary (WordNet) and syntactic dependencies. While this strengthens the model's ability to capture contextual dependencies, it still lacks the ability to model sequences, so we introduce BERT and a Bidirectional Long Short-Term Memory (BiLSTM) network to learn deeper textual features and improve the model's classification performance. Experimental results on four text classification datasets show that our model outperforms previously reported approaches. © 2022 ACM.
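The GCN propagation the abstract builds on can be sketched as follows. This is a minimal illustration of one graph-convolution layer over a toy word co-occurrence graph, not the authors' IMGCN implementation; the adjacency matrix, feature dimensions, and weights here are invented for demonstration.

```python
import numpy as np

def gcn_layer(A, X, W):
    """One GCN layer: ReLU(D^-1/2 (A+I) D^-1/2 @ X @ W)."""
    # Add self-loops, then symmetrically normalize the adjacency matrix
    A_tilde = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_tilde.sum(axis=1))
    A_hat = A_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # Propagate neighbor features and apply a ReLU nonlinearity
    return np.maximum(A_hat @ X @ W, 0.0)

# Toy text graph: 4 word nodes linked by co-occurrence (symmetric, unweighted)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))   # node features (e.g., word embeddings)
W = rng.normal(size=(8, 2))   # trainable layer weights
H = gcn_layer(A, X, W)
print(H.shape)  # (4, 2): one 2-dim hidden vector per word node
```

IMGCN enriches the edges of such a graph (WordNet relations, syntactic dependencies) and feeds the text through BERT and a BiLSTM as well; the layer above only shows the basic propagation step that any GCN variant shares.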
Year: 2022
Page: 323-331
Language: English
WoS CC Cited Count: 0
SCOPUS Cited Count: 9
ESI Highly Cited Papers on the List: 0