
Author:

Yang, Haoran | Wang, Yan | Li, Piji | Bi, Wei | Lam, Wai | Xu, Chen

Indexed by:

CPCI-S

Abstract:

Commonsense generation aims to generate a plausible sentence containing all given unordered concept words. Previous methods for this task usually concatenate these words directly as the input to a pre-trained language model (PLM). However, during PLM pre-training, the inputs are often corrupted sentences with correct word order. This discrepancy between the input distributions of pre-training and fine-tuning makes it difficult for the model to fully utilize the knowledge in PLMs. In this paper, we propose a two-stage framework to alleviate this issue. First, in the pre-training stage, we design a new input format to endow PLMs with the ability to handle masked sentences with incorrect word order. Second, during fine-tuning, we insert the special token [MASK] between consecutive concept words to make the input distribution more similar to that of pre-training. We conduct extensive experiments and provide a thorough analysis to demonstrate the effectiveness of our proposed method. The code is available at https://github.com/LHRYANG/CommonGen.
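
Below is a minimal sketch, in Python with HuggingFace transformers, of the fine-tuning input format the abstract describes: joining the unordered concept words with the mask token. This is not the authors' released code (see the linked repository for the actual implementation); the model name `facebook/bart-base` and the helper `build_masked_input` are illustrative assumptions, and note that BART-style tokenizers render the mask token as `<mask>` rather than `[MASK]`.

```python
# Sketch of the fine-tuning input format described in the abstract:
# insert the mask token between consecutive concept words so the input
# resembles the corrupted sentences seen during pre-training.
# Assumption: a BART-style PLM via HuggingFace transformers; the authors'
# released code (https://github.com/LHRYANG/CommonGen) may differ.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")

def build_masked_input(concepts):
    """Join unordered concept words with the tokenizer's mask token,
    e.g. ['dog', 'frisbee', 'catch'] -> 'dog <mask> frisbee <mask> catch'."""
    return f" {tokenizer.mask_token} ".join(concepts)

concepts = ["dog", "frisbee", "catch", "throw"]
text = build_masked_input(concepts)
print(text)  # dog <mask> frisbee <mask> catch <mask> throw
encoded = tokenizer(text, return_tensors="pt")  # ready to feed a seq2seq PLM
```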

Author Community:

  • [1] [Yang, Haoran] Chinese Univ Hong Kong, Hong Kong, Peoples R China
  • [2] [Lam, Wai] Chinese Univ Hong Kong, Hong Kong, Peoples R China
  • [3] [Wang, Yan] Tencent AI Lab, Shenzhen, Peoples R China
  • [4] [Li, Piji] Tencent AI Lab, Shenzhen, Peoples R China
  • [5] [Bi, Wei] Tencent AI Lab, Shenzhen, Peoples R China
  • [6] [Xu, Chen] Beijing Univ Technol, Beijing, Peoples R China

Source:

17th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2023

Year: 2023

Page: 376-383

Cited Count:

WoS CC Cited Count: 0

ESI Highly Cited Papers on the List: 0

30 Days PV: 7
