
Author:

Qu, Zhijie | Li, Juan | Ma, Zerui | Li, Jianqiang

Indexed by:

CPCI-S; EI; Scopus

Abstract:

Medical dialogue generation relies on natural language generation techniques to enable online medical consultations. Recently, the widespread adoption of large-scale models in natural language processing has driven rapid advances in this technology. Existing medical dialogue models are mostly based on BERT and pre-trained on English corpora, and high-performing models for Chinese medical dialogue generation are still lacking. To address this problem, this paper proposes CMed-GPT, a GPT pre-trained language model based on Chinese medical domain text. The model is available in two versions, base and large, with corresponding perplexity values of 8.64 and 8.01. Additionally, we incorporate lexical and entity embeddings into the dialogue text in a uniform manner to meet the requirements of downstream dialogue generation tasks. By applying both fine-tuning and p-tuning to CMed-GPT, we lowered the PPL from 8.44 to 7.35. This study not only confirms the strong performance of the CMed-GPT model in generating Chinese biomedical text but also highlights the advantages of p-tuning over traditional fine-tuning with prefix prompts. Furthermore, we validate the significance of incorporating external information in medical dialogue generation, which enhances the quality of the generated dialogue.
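The abstract compares model variants by perplexity (PPL), which is the exponential of the average per-token negative log-likelihood under the language model. A minimal sketch of that computation, using hypothetical per-token log-probabilities rather than the paper's actual evaluation code:

```python
import math

def perplexity(token_log_probs):
    """PPL = exp of the average negative log-likelihood per token."""
    nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(nll)

# Sanity check: if every token is assigned probability 1/8,
# the perplexity is exactly 8 (the model is "as uncertain as"
# a uniform choice over 8 tokens).
uniform = [math.log(1 / 8)] * 10
print(round(perplexity(uniform), 2))  # → 8.0
```

Lower PPL means the model assigns higher probability to held-out text, which is why the drop from 8.44 to 7.35 after p-tuning indicates improved modeling of the dialogue corpus.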

Keyword:

Chinese medical dialogue; P-tuning; Pre-trained language model

Author Community:

  • [ 1 ] [Qu, Zhijie]Beijing Univ Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Li, Juan]Beijing Univ Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Ma, Zerui]Beijing Univ Technol, Beijing 100124, Peoples R China
  • [ 4 ] [Li, Jianqiang]Beijing Univ Technol, Beijing 100124, Peoples R China

Reprint Author's Address:

  • [Qu, Zhijie]Beijing Univ Technol, Beijing 100124, Peoples R China

Source:

ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PT II, PAKDD 2024

ISSN: 2945-9133

Year: 2024

Volume: 14646

Page: 81-92

ESI Highly Cited Papers on the List: 0
