Abstract:
In the field of Natural Language Processing (NLP), traditional Chinese Named Entity Recognition (NER) tasks typically involve only a few entity types, but current real-world applications require more fine-grained entity types to support more detailed downstream NLP tasks. Since fine-grained Chinese NER is challenging for existing models, novel models need to be introduced. We propose a model that uses RoBERTa for word embeddings, followed by a convolutional attention layer and a CRF layer that outputs the entity labels. This model achieves better overall performance than baseline models on the CLUENER2020 fine-grained Chinese NER dataset. © 2020 IEEE.
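Editor's sketch: a minimal illustration of the architecture as described in the abstract (RoBERTa embeddings feeding a convolutional attention layer, with a CRF on top for label decoding). The checkpoint name, kernel size, the exact form of the convolutional attention (here, a 1-D convolution used as a gating score), and the use of the pytorch-crf package are assumptions made for illustration, not details taken from the paper.

import torch
import torch.nn as nn
from transformers import AutoModel
from torchcrf import CRF  # from the pytorch-crf package (assumed)

class RobertaConvAttnCRF(nn.Module):
    def __init__(self, num_labels,
                 encoder_name="hfl/chinese-roberta-wwm-ext",  # assumed checkpoint
                 kernel_size=3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Convolutional attention (assumed form): a 1-D convolution scores each
        # token from its local context; the scores gate the token representations.
        self.conv = nn.Conv1d(hidden, hidden, kernel_size, padding=kernel_size // 2)
        self.classifier = nn.Linear(hidden, num_labels)
        self.crf = CRF(num_labels, batch_first=True)

    def forward(self, input_ids, attention_mask, labels=None):
        # Contextual token embeddings from RoBERTa: (batch, seq_len, hidden)
        h = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        # Gate each token by its convolutional attention score
        scores = torch.sigmoid(self.conv(h.transpose(1, 2)).transpose(1, 2))
        emissions = self.classifier(h * scores)  # (batch, seq_len, num_labels)
        mask = attention_mask.bool()
        if labels is not None:
            # Training: negative log-likelihood of the gold label sequence
            return -self.crf(emissions, labels, mask=mask)
        # Inference: Viterbi-decoded best label path per sequence
        return self.crf.decode(emissions, mask=mask)

For CLUENER2020, num_labels would correspond to a BIO-style tag set over the dataset's ten fine-grained entity categories.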
Year: 2020
Page: 2104-2109
Language: English
SCOPUS Cited Count: 4
ESI Highly Cited Papers on the List: 0