Author:

Gao, Shengxin | Du, Jinlian | Zhang, Xiao

Indexed by:

EI; Scopus

Abstract:

Relation extraction is a necessary step in obtaining information from electronic medical records. Deep learning methods for relation extraction are primarily based on word2vec together with convolutional or recurrent neural networks. However, word vectors generated by word2vec are static and cannot adequately reflect the different meanings a polysemous word takes in different contexts, and the feature extraction ability of RNNs (Recurrent Neural Networks) is limited. Meanwhile, the BERT (Bidirectional Encoder Representations from Transformers) pre-trained language model has achieved excellent results on many natural language processing tasks. In this paper, we propose a medical relation extraction model based on BERT. We combine the representation of the whole sentence obtained from the pre-trained language model with the representations of the two medical entities to perform the relation extraction task. The experimental data were obtained from Chinese electronic medical records provided by a hospital in Beijing. Experimental results on these records show that our model's accuracy, precision, recall, and F1-score reach 67.37%, 69.54%, 67.38%, and 68.44%, respectively, all higher than those of three other methods. Because named entity recognition is a prerequisite for relation extraction, we will combine the model with named entity recognition in future work. © 2020 ACM.
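The record contains no code, but the architecture the abstract outlines (a whole-sentence BERT representation concatenated with the representations of the two medical entities, fed to a relation classifier) can be sketched as below, in PyTorch with the Hugging Face transformers library. The mean-pooling over entity spans, the bert-base-chinese checkpoint, and the single linear classification head are illustrative assumptions, not details confirmed by the paper.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class BertRelationExtractor(nn.Module):
        """Hypothetical sketch: the [CLS] sentence vector is concatenated
        with mean-pooled vectors of the two entity spans, then passed to a
        linear relation classifier."""

        def __init__(self, num_relations, model_name="bert-base-chinese"):
            super().__init__()
            self.bert = BertModel.from_pretrained(model_name)
            hidden = self.bert.config.hidden_size
            # sentence vector + two entity vectors -> relation logits
            self.classifier = nn.Linear(hidden * 3, num_relations)

        def forward(self, input_ids, attention_mask, e1_mask, e2_mask):
            # e1_mask / e2_mask are 0/1 tensors marking each entity's tokens
            states = self.bert(input_ids=input_ids,
                               attention_mask=attention_mask).last_hidden_state
            sent_vec = states[:, 0]  # [CLS] token as the sentence representation

            def pool(mask):
                # mean-pool the hidden states over one entity's token span
                m = mask.unsqueeze(-1).float()
                return (states * m).sum(dim=1) / m.sum(dim=1).clamp(min=1.0)

            e1_vec, e2_vec = pool(e1_mask), pool(e2_mask)
            return self.classifier(torch.cat([sent_vec, e1_vec, e2_vec], dim=-1))

Concatenating the [CLS] vector with pooled entity vectors is a common design for BERT-based relation extraction (e.g., R-BERT); the paper's exact pooling strategy and classification head may differ.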

Keyword:

Medical computing; Extraction; Natural language processing systems; Recurrent neural networks; Computational linguistics; Learning systems

Author Community:

  • [ 1 ] [Gao, Shengxin] Information Department, Beijing University of Technology, China
  • [ 2 ] [Du, Jinlian] Information Department, Beijing University of Technology, China
  • [ 3 ] [Zhang, Xiao] Information Department, Beijing University of Technology, China

Reprint Author's Address:

Email:

Source:

Year: 2020

Page: 487-490

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 5

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 14

Affiliated Colleges:
