Author:

Yuan, YingQi

Indexed by:

EI; Scopus

Abstract:

BiLSTM has been widely used for text classification, but the model cannot accurately measure the importance of each word and therefore cannot extract text features effectively. To address this problem, this paper proposes the BiLSTM-WSAttention neural network model for text classification. BiLSTM-WSAttention combines the context of words and sentences, extracting contextual semantic information in both directions, front to back and back to front. The model also introduces an attention mechanism: since a text is composed of sentences, a sentence is composed of words, and the importance of words and sentences depends on context, the model applies both word-level and sentence-level attention, assigning different weights to different words and sentences. Finally, the proposed method is compared with Naive-Bayes, CNN, RNN, and BLSTM classifiers on the same data set. The experimental results show that, compared with these methods, the proposed BiLSTM-WSAttention model is effective on this data set. © 2021 IEEE.
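The abstract describes a hierarchical design: a word-level BiLSTM with attention turns the words of each sentence into a sentence vector, and a sentence-level BiLSTM with attention turns the sentence vectors into a document vector used for classification. Below is a minimal sketch of that structure, assuming PyTorch; the layer sizes, class names, additive attention scoring, and final linear classifier are illustrative assumptions, not the paper's exact BiLSTM-WSAttention implementation.

    # Minimal sketch of a BiLSTM classifier with word- and sentence-level
    # attention (hierarchical attention), assuming PyTorch. Dimensions and
    # the scoring function are assumptions, not taken from the paper.
    import torch
    import torch.nn as nn

    class AttentionPool(nn.Module):
        """Additive attention that pools a sequence of vectors into one vector."""
        def __init__(self, dim):
            super().__init__()
            self.proj = nn.Linear(dim, dim)
            self.context = nn.Linear(dim, 1, bias=False)

        def forward(self, x):                              # x: (batch, seq_len, dim)
            scores = self.context(torch.tanh(self.proj(x)))  # (batch, seq_len, 1)
            weights = torch.softmax(scores, dim=1)            # attention weights
            return (weights * x).sum(dim=1)                    # (batch, dim)

    class HierBiLSTMAttention(nn.Module):
        """Words -> sentence vectors -> document vector -> class scores."""
        def __init__(self, vocab_size, emb_dim=100, hid_dim=64, num_classes=2):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.word_lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                                     bidirectional=True)
            self.word_attn = AttentionPool(2 * hid_dim)      # word-level weights
            self.sent_lstm = nn.LSTM(2 * hid_dim, hid_dim, batch_first=True,
                                     bidirectional=True)
            self.sent_attn = AttentionPool(2 * hid_dim)      # sentence-level weights
            self.classifier = nn.Linear(2 * hid_dim, num_classes)

        def forward(self, docs):                   # docs: (batch, n_sents, n_words)
            b, n_sents, n_words = docs.shape
            words = self.embed(docs.view(b * n_sents, n_words))
            word_ctx, _ = self.word_lstm(words)    # context from both directions
            sent_vecs = self.word_attn(word_ctx).view(b, n_sents, -1)
            sent_ctx, _ = self.sent_lstm(sent_vecs)
            doc_vec = self.sent_attn(sent_ctx)
            return self.classifier(doc_vec)

    # Example: a batch of 4 documents, 3 sentences each, 10 word ids per sentence.
    model = HierBiLSTMAttention(vocab_size=5000)
    logits = model(torch.randint(0, 5000, (4, 3, 10)))
    print(logits.shape)  # torch.Size([4, 2])

Running the example prints torch.Size([4, 2]), one score per class for each of the four documents. The tanh-plus-context-vector scoring used here is one common form of additive attention; the paper does not state which scoring function BiLSTM-WSAttention uses.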

Keyword:

Classification (of information); Recurrent neural networks; Semantics; Text processing

Author Community:

  • [ 1 ] [Yuan, YingQi]Beijing University of Technology, Faculty of Information Technology, Beijing, China

Reprint Author's Address:

Email:

Source:

ISSN: 2689-6621

Year: 2021

Page: 2235-2239

Language: English

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count: 6

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 5
