Author:

Lin, Shaofu | Wang, Mengzhen | Shi, Chengyu | Xu, Zhe | Chen, Lihong | Gao, Qingcai | Chen, Jianhui

Indexed by:

EI; Scopus; SCIE

Abstract:

Background: Medication recommendation based on electronic medical records (EMRs) is a research hotspot in smart healthcare. An important challenge in developing computational medication recommendation methods based on EMRs is the scarcity of large-scale longitudinal EMR data with temporal correlation. To address this challenge, this paper proposes a new EMR-based medication recommendation model, MR-KPA, which combines knowledge-enhanced pre-training with a deep adversarial network to improve medication recommendation in both feature representation and the fine-tuning process. First, a knowledge-enhanced pre-training visit model is proposed to realize domain-knowledge-based external feature fusion and pre-training-based internal feature mining, improving the feature representation. Second, a medication recommendation model based on the deep adversarial network is developed to optimize the fine-tuning process of the pre-training visit model and alleviate the model over-fitting caused by the task gap between pre-training and recommendation. Results: Experimental results on EMRs from medical and health institutions in Hainan Province, China, show that the proposed MR-KPA model effectively improves the accuracy of medication recommendation on small-scale longitudinal EMR data compared with existing representative methods. Conclusion: The advantages of the proposed MR-KPA are mainly attributed to knowledge enhancement based on ontology embedding, the pre-training visit model, and adversarial training. Each of these three optimizations effectively improves medication recommendation on small-scale longitudinal EMR data, with the pre-training visit model yielding the most significant improvement. The three optimizations are also complementary, and their integration enables the proposed MR-KPA model to achieve the best recommendation performance.
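
To make the adversarial fine-tuning idea above concrete, the following is a minimal sketch of one generic adversarial training step in the FGM (fast gradient method) style, in which the embedding table is perturbed along its gradient and the loss is recomputed on the perturbed representation. It illustrates the general technique only and is not the MR-KPA implementation described in the paper; the model, embedding layer, and batch fields (visit codes, multi-hot drug labels) are hypothetical placeholders.

import torch.nn.functional as F

def fgm_adversarial_step(model, embedding_layer, batch, optimizer, epsilon=1e-2):
    # One training step: clean loss plus adversarial loss on perturbed embeddings.
    optimizer.zero_grad()

    # Clean forward/backward pass (multi-label medication prediction).
    # batch["labels"] is assumed to be a float multi-hot tensor of shape (B, n_drugs).
    logits = model(batch["codes"])
    loss = F.binary_cross_entropy_with_logits(logits, batch["labels"])
    loss.backward()

    # FGM: perturb the embedding table along its (normalized) gradient.
    emb = embedding_layer.weight
    grad = emb.grad.detach()
    norm = grad.norm()
    if norm > 0:
        delta = epsilon * grad / norm
        emb.data.add_(delta)

        # Adversarial forward/backward pass; gradients accumulate with the clean pass.
        adv_logits = model(batch["codes"])
        adv_loss = F.binary_cross_entropy_with_logits(adv_logits, batch["labels"])
        adv_loss.backward()

        # Restore the original embedding weights before the optimizer update.
        emb.data.sub_(delta)

    optimizer.step()
    return loss.item()

Schemes of this kind are commonly used to regularize fine-tuning on small datasets, which is the setting the abstract describes for small-scale longitudinal EMR data.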

Keyword:

Medication recommendation; Adversarial training; Pre-training model; Graph attention network; Electronic medical record

Author Community:

  • [ 1 ] [Lin, Shaofu]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 2 ] [Wang, Mengzhen]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 3 ] [Shi, Chengyu]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 4 ] [Xu, Zhe]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 5 ] [Chen, Lihong]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 6 ] [Gao, Qingcai]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 7 ] [Chen, Jianhui]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 8 ] [Chen, Lihong]Beijing Univ Technol, Beijing Int Collaborat Base Brain Informat & Wisdo, Beijing, Peoples R China
  • [ 9 ] [Gao, Qingcai]Beijing Univ Technol, Beijing Int Collaborat Base Brain Informat & Wisdo, Beijing, Peoples R China
  • [ 10 ] [Chen, Jianhui]Beijing Univ Technol, Beijing Int Collaborat Base Brain Informat & Wisdo, Beijing, Peoples R China
  • [ 11 ] [Chen, Jianhui]Beijing Univ Technol, Beijing Key Lab MRI & Brain Informat, Beijing, Peoples R China
  • [ 12 ] [Chen, Jianhui]Minist Educ, Engn Res Ctr Intelligent Percept & Autonomous Cont, Beijing, Peoples R China
  • [ 13 ] [Chen, Jianhui]Minist Educ, Engn Res Ctr Digital Community, Beijing, Peoples R China

Source:

BMC BIOINFORMATICS

ISSN: 1471-2105

Year: 2022

Issue: 1

Volume: 23

Impact Factor: 3.000 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 46

JCR Journal Grade: 2

CAS Journal Grade: 3

SCOPUS Cited Count: 2

ESI Highly Cited Papers on the List: 0
