
Author:

Ji, Z. | Xiao, Y.

Indexed by:

CPCI-S EI Scopus

Abstract:

The Flat-LAttice Transformer (FLAT) has achieved notable success in Chinese named entity recognition (NER) by integrating lexical information into the widely used Transformer encoder. FLAT augments each sentence with a flat lattice, a token sequence containing both the characters and the matched lexicon words, and computes self-attention among all tokens. However, FLAT suffers from quadratic complexity in the lattice length, so lengthy sentences with numerous matched words significantly increase memory and computational costs. To alleviate this issue, we propose a novel lightweight lexicon-enhanced Transformer (LLET) for Chinese NER. Specifically, we introduce two distinct variants in which characters attend to characters and words either jointly or separately. Experimental results on four public Chinese NER datasets show that both variants achieve significant memory savings while maintaining performance comparable to FLAT. © 2024 IEEE.
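The flat-lattice construction the abstract describes can be illustrated with a minimal sketch (not the authors' code; the function name and the brute-force span matcher are illustrative assumptions): the lattice is the character sequence plus every lexicon word found as a contiguous span, each token carrying its head/tail character positions. Because every matched word adds a token, the lattice length T grows with the number of matches, and full self-attention costs O(T²), which is the bottleneck LLET targets.

```python
def build_flat_lattice(sentence, lexicon):
    """Sketch of a FLAT-style flat lattice: a list of (token, head, tail)
    triples, characters first, then every lexicon word occurring as a
    contiguous span of the sentence (brute-force matching for clarity)."""
    # Each character spans a single position: head == tail.
    lattice = [(ch, i, i) for i, ch in enumerate(sentence)]
    n = len(sentence)
    # Append every multi-character span that appears in the lexicon.
    for head in range(n):
        for tail in range(head + 1, n):
            word = sentence[head:tail + 1]
            if word in lexicon:
                lattice.append((word, head, tail))
    return lattice

# Classic segmentation-ambiguity example: "Nanjing Yangtze River Bridge".
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}
lattice = build_flat_lattice("南京市长江大桥", lexicon)
# 7 character tokens plus 6 matched word tokens, e.g. ("长江大桥", 3, 6)
```

In FLAT, self-attention runs over all 13 tokens above; the two LLET variants instead restrict which tokens characters attend to (characters and words jointly, or each separately), shrinking the attention matrices.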

Keyword:

Lightweight; Chinese NER; Transformer; Lexicon-Enhanced Transformer

Author Community:

  • [ 1 ] [Ji Z.]PAII Inc, California, United States
  • [ 2 ] [Xiao Y.]Beijing University of Technology, Beijing, China

Reprint Author's Address:

Email:


Related Keywords:

Source:

ISSN: 1520-6149

Year: 2024

Page: 12677-12681

Language: English

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count: 6

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:

Address: BJUT Library (100 Pingleyuan, Chaoyang District, Beijing 100124, China; Post Code: 100124) Contact Us: 010-67392185
Copyright: BJUT Library. Technical Support: Beijing Aegean Software Co., Ltd.