
Author:

Zhang, L. | Li, Z. | Guo, T. | Huang, P.

Indexed by:

EI Scopus

Abstract:

Current research on aspect-based sentiment analysis suffers from low utilisation of the text, insufficient extraction of semantic and sentiment features from review text, and poor recognition of polysemous words. To address these problems, this paper proposes a BERT-based dual-channel hybrid neural network sentiment analysis model (BDCM). First, the BERT model vectorises the text; then a dual-channel hybrid neural network, built from a BiLSTM channel and a BiGRU channel, extracts the global and local feature information of the text. Finally, the features output by the two channels are horizontally fused for sentiment classification. Experimental results on the SemEval 2014 dataset show that BERT extracts semantic features from text more effectively and that, compared with the baseline model, the dual-channel hybrid neural network fused with attention mechanisms extracts features better and significantly improves sentiment classification performance. © 2024 IEEE.
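The dual-channel structure the abstract describes can be sketched as follows. This is a minimal PyTorch illustration, not the paper's implementation: the BERT encoder is replaced by a plain embedding layer to keep the sketch self-contained, the attention mechanisms are omitted, and all dimensions and the use of mean pooling are illustrative assumptions. What it shows is the core idea of two recurrent channels (BiLSTM and BiGRU) reading the same embedded text, with their pooled outputs concatenated ("horizontally fused") before classification.

```python
import torch
import torch.nn as nn

class DualChannelSketch(nn.Module):
    """Hypothetical sketch of a BDCM-style dual-channel classifier."""

    def __init__(self, vocab_size=1000, embed_dim=64, hidden=32, classes=3):
        super().__init__()
        # Stand-in for the BERT vectorisation step in the paper.
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Channel 1: BiLSTM; channel 2: BiGRU. Both read the same input.
        self.bilstm = nn.LSTM(embed_dim, hidden, batch_first=True,
                              bidirectional=True)
        self.bigru = nn.GRU(embed_dim, hidden, batch_first=True,
                            bidirectional=True)
        # Horizontal fusion: the two channels' outputs are concatenated,
        # so the classifier sees 2*hidden features from each channel.
        self.classifier = nn.Linear(4 * hidden, classes)

    def forward(self, token_ids):
        x = self.embed(token_ids)          # (batch, seq, embed_dim)
        lstm_out, _ = self.bilstm(x)       # (batch, seq, 2*hidden)
        gru_out, _ = self.bigru(x)         # (batch, seq, 2*hidden)
        # Mean-pool each channel over the sequence, then fuse.
        fused = torch.cat([lstm_out.mean(dim=1), gru_out.mean(dim=1)],
                          dim=-1)          # (batch, 4*hidden)
        return self.classifier(fused)      # (batch, classes)

model = DualChannelSketch()
logits = model(torch.randint(0, 1000, (2, 10)))  # batch of 2, length 10
print(logits.shape)
```

A batch of two 10-token sequences yields logits of shape `(2, 3)`, one score per sentiment class per sequence.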

Keyword:

BiLSTM; Attention mechanism; Aspect; Emotional Classification; Hybrid model; BiGRU; Dual-Channel

Author Community:

  • [ 1 ] [Zhang L.]Beijing University of Technology, Faculty of Informatics, School of Software, Beijing, China
  • [ 2 ] [Li Z.]Beijing University of Technology, Faculty of Informatics, School of Software, Beijing, China
  • [ 3 ] [Guo T.]Beijing University of Technology, Faculty of Informatics, School of Software, Beijing, China
  • [ 4 ] [Huang P.]Beijing University of Technology, Faculty of Informatics, School of Software, Beijing, China

Reprint Author's Address:

Email:


Related Keywords:

Source :

ISSN: 2689-6621

Year: 2024

Page: 258-261

Language: English

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:
