Abstract:
Named Entity Recognition (NER) is a fundamental task in natural language processing, and syntax plays a significant role in recognizing the boundaries and types of entities. Unlike English, Chinese has no explicit word delimiters, so Chinese NER often struggles to determine entity boundaries; for the same reason, syntactic parsing results can contain errors caused by incorrect word segmentation. In this paper, we propose a dual-grained syntax-aware Transformer network that mitigates the noise in single-grained syntactic parsing results by incorporating dual-grained syntactic information. Specifically, we first introduce syntax-aware Transformers to model dual-grained syntax-aware features and a contextual Transformer to model contextual features. We then design a triple feature aggregation module to dynamically fuse these features. We validate the effectiveness of our approach on three public datasets. © 2024 IEEE.
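The abstract only names the components, not their internals. As a rough illustration, the sketch below shows one way a triple feature aggregation module could dynamically fuse a contextual stream with two syntax-aware streams; the softmax-gated weighted sum and all class, function, and variable names are assumptions for illustration, not the paper's actual design.

import torch
import torch.nn as nn

class TripleFeatureAggregation(nn.Module):
    """Hypothetical softmax-gated fusion of three token-level feature streams."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        # One scalar score per stream, computed from the stream itself.
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, contextual, char_syntax, word_syntax):
        # Each input has shape (batch, seq_len, hidden_dim).
        feats = torch.stack([contextual, char_syntax, word_syntax], dim=2)  # (B, L, 3, H)
        # Per-token weights over the three streams ("dynamic" fusion).
        weights = torch.softmax(self.score(feats), dim=2)                   # (B, L, 3, 1)
        return (weights * feats).sum(dim=2)                                 # (B, L, H)

# Usage example with random features for 2 sentences of length 5.
fuse = TripleFeatureAggregation(hidden_dim=128)
streams = [torch.randn(2, 5, 128) for _ in range(3)]
fused = fuse(*streams)  # -> shape (2, 5, 128)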
ISSN: 1520-6149
Year: 2024
Page: 12717-12721
Language: English
SCOPUS Cited Count: 6
ESI Highly Cited Papers on the List: 0