Abstract:
Text classification is an important task in natural language processing, in which texts are assigned to different categories by labelling. However, owing to the complex structure and deep semantics of natural language, it remains challenging to analyze the intrinsic structure of text and to extract text features. In this paper, we combine local and global features in parallel to extract latent semantic information from text data. Global features are obtained with the Context-BiLSTM method, which fuses the forward and backward hidden states through a gate mechanism before they enter the cell, instead of simply summing the two at the end. Local features are obtained with the Pooling Word Attention CNN (PWACNN) model, which generates the pooling results with the participation of the embedding, so that the pooling results can attend to their relevant features in the original embedding. Finally, the output of the fused PWACNN and Context-BiLSTM model is fed into a softmax layer for classification. Experimental results show that the proposed model performs well and significantly improves classification accuracy. ©2023 IEEE.
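The gated fusion of forward and backward hidden states mentioned in the abstract can be sketched as below. The exact gate form is an assumption (the abstract gives no equations); here a learned sigmoid gate produces an elementwise convex combination of the two directions, which is one common alternative to simple summation:

```python
import numpy as np

def sigmoid(x):
    # Elementwise logistic function used for the gate.
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h_f, h_b, W, b):
    """Fuse forward (h_f) and backward (h_b) hidden states with a gate.

    Assumed form: g = sigmoid(W @ [h_f; h_b] + b)
                  h = g * h_f + (1 - g) * h_b
    Each output element is a convex combination of the two directions,
    rather than their plain sum.
    """
    g = sigmoid(W @ np.concatenate([h_f, h_b]) + b)
    return g * h_f + (1.0 - g) * h_b

# Tiny demonstration with random states (W and b would be learned).
rng = np.random.default_rng(0)
d = 4
h_f = rng.standard_normal(d)
h_b = rng.standard_normal(d)
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)
h = gated_fusion(h_f, h_b, W, b)
```

Because the gate is elementwise and sigmoid-valued, every component of `h` lies between the corresponding components of `h_f` and `h_b`, which makes the fusion easy to sanity-check.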
Year: 2023
Page: 18-24
Language: English
ESI Highly Cited Papers on the List: 0