
Author:

Yang, H. | Wang, Y. | Li, P. | Bi, W. | Lam, W. | Xu, C.

Indexed by:

Scopus

Abstract:

Commonsense generation aims to generate a plausible sentence containing all given unordered concept words. Previous methods for this task usually concatenate these words directly as the input of a pre-trained language model (PLM). However, in PLMs' pre-training process, the inputs are often corrupted sentences with correct word order. This input distribution discrepancy between pre-training and fine-tuning makes it difficult for the model to fully utilize the knowledge of PLMs. In this paper, we propose a two-stage framework to alleviate this issue. First, in the pre-training stage, we design a new input format to endow PLMs with the ability to deal with masked sentences whose word order is incorrect. Second, during fine-tuning, we insert the special token [MASK] between two consecutive concept words to make the input distribution more similar to that of pre-training. We conduct extensive experiments and provide a thorough analysis to demonstrate the effectiveness of our proposed method. The code is available at https://github.com/LHRYANG/CommonGen. © 2023 Association for Computational Linguistics.
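To make the fine-tuning input described in the abstract concrete, the following is a minimal Python sketch of the idea of inserting a mask token between consecutive concept words so that the concatenated concepts resemble a corrupted, masked sentence. The mask token string, the example concepts, and the helper name build_masked_concept_input are illustrative assumptions, not the authors' code; their actual implementation is in the linked repository.

# Illustrative sketch (not the authors' exact code): build a fine-tuning input
# by placing a mask token between consecutive concept words, so the input looks
# like a masked sentence rather than a bare concept list. The mask token depends
# on the PLM used (e.g. "<mask>" for BART); "[MASK]" follows the abstract's notation.

def build_masked_concept_input(concepts, mask_token="[MASK]"):
    """Join unordered concept words, separating consecutive concepts with a mask token."""
    return f" {mask_token} ".join(concepts)

if __name__ == "__main__":
    concepts = ["dog", "frisbee", "catch", "throw"]
    print(build_masked_concept_input(concepts))
    # -> dog [MASK] frisbee [MASK] catch [MASK] throw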

Keyword:

Author Community:

  • [ 1 ] [Yang H.]The Chinese University of Hong Kong, Hong Kong
  • [ 2 ] [Wang Y.]Tencent AI Lab., United States
  • [ 3 ] [Li P.]Tencent AI Lab., United States
  • [ 4 ] [Bi W.]Tencent AI Lab., United States
  • [ 5 ] [Lam W.]The Chinese University of Hong Kong, Hong Kong
  • [ 6 ] [Xu C.]Beijing University of Technology, China

Reprint Author's Address:

Email:



Source:

Year: 2023

Page: 376-383

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:
