
Author:

Zhao, Jianyu | Zhan, Zhiqiang | Li, Tong | Li, Rang | Hu, Changjian | Wang, Siyun | Zhang, Yang

Indexed by:

EI Scopus SCIE

Abstract:

Table-to-Text generation aims to generate natural-language descriptions for factual tables, which can be viewed as sets of field-value records. Despite significant progress, state-of-the-art models suffer from two major issues: Nonfluency and Divergence. Nonfluency means that descriptions generated by models are not as fluent as those written by humans, and thus can be distinguished easily. Divergence refers to the fact that generated sentences contain information which cannot be concluded from the factual tables. This can be attributed to the fact that most neural models are trained with the Maximum Likelihood Estimation (MLE) loss and use divergence-contained references as the ground truth, which forces the models, to some extent, to learn what cannot be inferred from the source. Motivated by these limitations, we propose a novel GAN-based model with an adversarial learning mechanism, which simultaneously trains a generative model G and a discriminative model D, to address the Nonfluency and Divergence issues in Table-to-Text generation. Specifically, we build the generator G as a reinforcement-learning agent with a sequence-to-sequence architecture, which takes the raw table data as input and predicts the generated sentences. Meanwhile, we build the discriminator D with a Convolutional Neural Network (CNN) to calculate rewards that measure the fluency of generations. To judge the fidelity of generations with regard to the original table more accurately, we also calculate rewards from BLEU-Table. With the fused rewards from the CNN and BLEU-Table, our methods outperform the baselines by a large margin on the WikiBio and Wiki3C benchmarks evaluated with BLEU, ROUGE, and PARENT. Specifically, our models achieve 49.0 (BLEU-4), 37.8 (ROUGE-4) and 45.4 (PARENT) on WikiBio, as well as 12.9 (BLEU-4) and 6.9 (ROUGE-4) on Wiki3C. More importantly, we construct a new Wiki3C dataset that addresses the scarcity of suitable datasets and promotes progress in Table-to-Text generation. 
(c) 2021 Elsevier B.V. All rights reserved.
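The abstract's central idea is a fused reward: the generator is rewarded partly for fluency (the CNN discriminator's score) and partly for fidelity to the table (a BLEU-style overlap against the table's field values). The sketch below illustrates that fusion only; the function names, the unigram-precision fidelity proxy, and the mixing weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
from collections import Counter

def ngram_precision(candidate, reference, n=1):
    """Clipped n-gram precision of candidate tokens against reference tokens
    (a simplified BLEU-style overlap used here as a table-fidelity proxy)."""
    cand = [tuple(candidate[i:i + n]) for i in range(len(candidate) - n + 1)]
    ref = [tuple(reference[i:i + n]) for i in range(len(reference) - n + 1)]
    if not cand:
        return 0.0
    cand_counts, ref_counts = Counter(cand), Counter(ref)
    # Clip each n-gram's count by its count in the reference (table values).
    clipped = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
    return clipped / len(cand)

def fused_reward(disc_score, sentence_tokens, table_tokens, lam=0.5):
    """Mix the discriminator's fluency score with the table-overlap
    fidelity score; `lam` balances the two terms (assumed, not from the paper)."""
    fidelity = ngram_precision(sentence_tokens, table_tokens, n=1)
    return lam * disc_score + (1 - lam) * fidelity

# Example: a sentence that copies 3 of its 4 tokens from the table
# gets fidelity 0.75; with a discriminator score of 0.8 and lam=0.5,
# the fused reward is 0.5*0.8 + 0.5*0.75 = 0.775.
sentence = ["john", "smith", "born", "1970"]
table = ["john", "smith", "1970", "london"]
reward = fused_reward(0.8, sentence, table, lam=0.5)
```

In the full model this scalar would serve as the policy-gradient reward for the sequence-to-sequence generator; the snippet only shows how the two signals combine into one value.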

Keyword:

Table-to-Text generation; Natural language generation; Generative adversarial network

Author Community:

  • [ 1 ] [Zhao, Jianyu]Lenovo Res, AI Lab, Beijing, Peoples R China
  • [ 2 ] [Li, Rang]Lenovo Res, AI Lab, Beijing, Peoples R China
  • [ 3 ] [Hu, Changjian]Lenovo Res, AI Lab, Beijing, Peoples R China
  • [ 4 ] [Zhan, Zhiqiang]Lenovo Res, Smart Educ Lab, Beijing, Peoples R China
  • [ 5 ] [Zhang, Yang]Lenovo Res, Smart Educ Lab, Beijing, Peoples R China
  • [ 6 ] [Li, Tong]Beijing Univ Technol, Comp Sci, Beijing, Peoples R China
  • [ 7 ] [Wang, Siyun]Univ Southern Calif, Los Angeles, CA 90089 USA

Reprint Author's Address:

  • [Zhang, Yang]Lenovo Res, Smart Educ Lab, Beijing, Peoples R China


Source :

NEUROCOMPUTING

ISSN: 0925-2312

Year: 2021

Volume: 452

Page: 28-36

Impact Factor: 6.000 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE;

ESI HC Threshold:87

JCR Journal Grade:2

Cited Count:

WoS CC Cited Count: 9

SCOPUS Cited Count: 9

ESI Highly Cited Papers on the List: 0


