
Author:

Gu, T. | Zhu, Z. | Zhao, Q.

Indexed by:

CPCI-S, EI, Scopus

Abstract:

Medical Named Entity Recognition is essential for structuring medical text data, thereby aiding in the creation of medical applications like knowledge graphs and diagnostic systems. Contemporary methods predominantly utilize word embedding models, subsequently augmented by various models for semantic comprehension, to enhance entity recognition performance. However, in the medical domain, specialized terminologies pose a challenge for general-domain word embedding models. Furthermore, current methods frequently neglect local semantic attributes and face challenges in fully grasping global semantic features, owing to the dimensional constraints of word embeddings. To address these challenges, we propose the Stacked Attention Network (SAN) for Chinese medical NER. We fine-tune RoBERTa using real-world electronic medical record data to incorporate medical term features and utilize a CNN model to extract local semantic features. Furthermore, we introduce a stacked BiLSTM with a multi-layer structure to effectively capture global semantic information. Experimental results on real-world medical text data demonstrate that our SAN model achieves an F1 score of 91.5%, outperforming other advanced NER models. © 2024 IEEE.
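The abstract outlines a three-stage pipeline: contextual token embeddings from fine-tuned RoBERTa, a CNN for local semantic features, and a stacked BiLSTM for global context, followed by per-token tag prediction. A minimal NumPy sketch of that data flow is below; it is not the authors' implementation — the random stand-in embeddings, hidden sizes, layer count, and tag count are all illustrative assumptions, and the attention component of SAN is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm(seq, H):
    """Run a single-direction LSTM over seq of shape (T, D); return (T, H) hidden states."""
    T, D = seq.shape
    W = rng.normal(0, 0.1, (D, 4 * H))   # input weights for i, f, g, o gates
    U = rng.normal(0, 0.1, (H, 4 * H))   # recurrent weights
    b = np.zeros(4 * H)
    h, c = np.zeros(H), np.zeros(H)
    out = np.zeros((T, H))
    for t in range(T):
        z = seq[t] @ W + h @ U + b
        i, f, g, o = np.split(z, 4)
        i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
        c = f * c + i * np.tanh(g)       # cell state update
        h = o * np.tanh(c)
        out[t] = h
    return out

def bilstm(seq, H):
    """Concatenate forward and backward LSTM states per timestep -> (T, 2H)."""
    fwd = lstm(seq, H)
    bwd = lstm(seq[::-1], H)[::-1]
    return np.concatenate([fwd, bwd], axis=1)

def conv1d_local(emb, C, width=3):
    """Same-padded 1-D convolution over tokens, extracting local n-gram features."""
    T, D = emb.shape
    Wk = rng.normal(0, 0.1, (width, D, C))
    pad = width // 2
    padded = np.vstack([np.zeros((pad, D)), emb, np.zeros((pad, D))])
    out = np.zeros((T, C))
    for t in range(T):
        window = padded[t:t + width]                      # (width, D)
        out[t] = np.maximum(0, np.einsum('wd,wdc->c', window, Wk))  # ReLU
    return out

def san_forward(emb, hidden=64, conv_ch=64, layers=2, num_tags=9):
    """Illustrative pipeline: embeddings -> CNN local features -> stacked BiLSTM -> tag logits."""
    feats = conv1d_local(emb, conv_ch)       # local semantic features
    x = np.concatenate([emb, feats], axis=1)
    for _ in range(layers):                  # multi-layer (stacked) BiLSTM
        x = bilstm(x, hidden)
    Wo = rng.normal(0, 0.1, (x.shape[1], num_tags))
    return x @ Wo                            # per-token tag logits (e.g. BIO labels)

emb = rng.normal(size=(12, 32))  # stand-in for RoBERTa token embeddings, T=12, D=32
logits = san_forward(emb)
print(logits.shape)  # → (12, 9)
```

In the paper's setting the embeddings would come from RoBERTa fine-tuned on electronic medical records, and the logits would typically feed a CRF or softmax decoder to produce entity labels.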

Keyword:

Stacked BiLSTM; Attention; Medical named entity recognition; RoBERTa

Author Community:

  • [1] Gu T.: Beijing University of Technology, Beijing, China
  • [2] Zhu Z.: Beijing University of Technology, Beijing, China
  • [3] Zhao Q.: Beijing University of Technology, Beijing, China


Year: 2024

Page: 2032-2037

Language: English

ESI Highly Cited Papers on the List: 0
