
Author:

Yin, Jun | Zhu, Cui | Zhu, Wenjun

Indexed by:

EI; Scopus

Abstract:

The Transformer has strong feature extraction ability and has achieved good performance on various NLP tasks such as sentence classification, machine translation, and reading comprehension, but it does not perform as well on named entity recognition (NER) tasks. According to recent research, Long Short-Term Memory (LSTM) networks usually outperform the Transformer on NER. LSTM is a variant of the Recurrent Neural Network (RNN); because of its natural chain structure, it learns the forward and backward dependencies between words well, which makes it very suitable for processing text sequences. In this paper, the BiLSTM network structure is embedded into the Transformer Encoder, and a new network structure, BiLSTM-IN-TRANS, is proposed, which combines the sequential feature extraction capability of BiLSTM with the powerful global feature extraction capability of the Transformer Encoder. The experiments show that the model based on BiLSTM-IN-TRANS works better on the NER task than LSTM or the Transformer alone. © 2022 SPIE.
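
As a rough illustration of the architecture the abstract describes, the sketch below (PyTorch) inserts a BiLSTM sub-layer into a standard Transformer encoder layer, between the self-attention and feed-forward sub-layers. The class name BiLSTMInTransLayer, the exact placement of the BiLSTM, and the residual/normalization scheme are assumptions made for illustration; the abstract does not specify the authors' implementation details.

    # Minimal sketch of a Transformer encoder layer with an embedded BiLSTM
    # sub-layer, in the spirit of BiLSTM-IN-TRANS. Placement of the BiLSTM is
    # an assumption, not the authors' published implementation.
    import torch
    import torch.nn as nn

    class BiLSTMInTransLayer(nn.Module):
        def __init__(self, d_model=256, nhead=8, dim_ff=1024, dropout=0.1):
            super().__init__()
            self.self_attn = nn.MultiheadAttention(d_model, nhead,
                                                   dropout=dropout, batch_first=True)
            # BiLSTM sub-layer: hidden size d_model // 2 per direction keeps the
            # output width equal to d_model
            self.bilstm = nn.LSTM(d_model, d_model // 2,
                                  batch_first=True, bidirectional=True)
            self.ff = nn.Sequential(nn.Linear(d_model, dim_ff), nn.ReLU(),
                                    nn.Linear(dim_ff, d_model))
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)
            self.norm3 = nn.LayerNorm(d_model)
            self.dropout = nn.Dropout(dropout)

        def forward(self, x, key_padding_mask=None):
            # 1) multi-head self-attention: global feature extraction
            attn_out, _ = self.self_attn(x, x, x, key_padding_mask=key_padding_mask)
            x = self.norm1(x + self.dropout(attn_out))
            # 2) BiLSTM sub-layer: sequential (forward/backward) dependencies
            lstm_out, _ = self.bilstm(x)
            x = self.norm2(x + self.dropout(lstm_out))
            # 3) position-wise feed-forward, as in the standard Transformer encoder
            x = self.norm3(x + self.dropout(self.ff(x)))
            return x

    # Usage: a batch of 2 sentences, 10 tokens each, 256-dim embeddings
    layer = BiLSTMInTransLayer()
    out = layer(torch.randn(2, 10, 256))   # -> shape (2, 10, 256)

For NER, several such layers would typically be stacked over token embeddings and followed by a per-token classification (or CRF) head; those details are likewise not given in this record.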

Keyword:

Signal encoding; Extraction; Feature extraction; Long short-term memory; Natural language processing systems

Author Community:

  • [ 1 ] [Yin, Jun]Beijing University of Technology, Faculty of Information Technology, Chaoyang District, Beijing, China
  • [ 2 ] [Zhu, Cui]Beijing University of Technology, Faculty of Information Technology, Chaoyang District, Beijing, China
  • [ 3 ] [Zhu, Wenjun]Beijing University of Technology, Faculty of Information Technology, Chaoyang District, Beijing, China

Reprint Author's Address:

Email:

Source:

ISSN: 0277-786X

Year: 2022

Volume: 12305

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

Affiliated Colleges:
