
Author:

Wu, D. | Zheng, S. | Wang, Q.

Indexed by:

EI; Scopus

Abstract:

Prompt-based pre-trained language models (PLMs) have demonstrated superior performance on a wide variety of downstream tasks. In particular, prompt tuning has significantly outperformed traditional fine-tuning in zero-shot and few-shot learning scenarios. The core idea of prompt tuning is to convert different downstream tasks into masked language modeling problems through prompts, which bridges the gap between pre-training tasks and downstream tasks for better results. The verbalizer, as an important part of prompt tuning, largely determines the final performance of the model, but the design of Chinese verbalizers has yet to be fully explored. In this paper, we propose a method for Chinese text classification that expands the verbalizer by extracting knowledge from the training set. In brief, we first segment the Chinese training set, then filter for words that express the semantics of the labels by semantic similarity, and finally add them to the verbalizer. Extensive experimental results on multiple text classification datasets show that our approach significantly outperforms ordinary prompt tuning as well as other methods for constructing the verbalizer. © 2023 SPIE.
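The verbalizer-expansion step summarized in the abstract can be sketched roughly as follows. This is a minimal illustration under assumptions the abstract does not state: jieba is used for Chinese word segmentation and a multilingual sentence-transformers model for semantic similarity; the function name expand_verbalizer, the similarity threshold, and the chosen embedding model are illustrative, not the authors' implementation.

import jieba
from sentence_transformers import SentenceTransformer, util

SIM_THRESHOLD = 0.6  # hypothetical cut-off; the paper does not state a value
model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

def expand_verbalizer(train_texts, verbalizer):
    """Expand a verbalizer with training-set words close to each label's seed words.

    train_texts : list[str]            -- raw Chinese training sentences
    verbalizer  : dict[str, list[str]] -- label -> seed label words
    """
    # 1. Segment the Chinese training set and collect candidate words.
    candidates = set()
    for text in train_texts:
        candidates.update(w for w in jieba.cut(text) if len(w) > 1)
    cand_list = sorted(candidates)
    cand_emb = model.encode(cand_list, convert_to_tensor=True)

    # 2. For each label, keep candidates whose best similarity to any seed word
    #    exceeds the threshold, and append them to that label's word list.
    expanded = {}
    for label, seeds in verbalizer.items():
        seed_emb = model.encode(seeds, convert_to_tensor=True)
        best_sim = util.cos_sim(cand_emb, seed_emb).max(dim=1).values
        extra = [w for w, s in zip(cand_list, best_sim.tolist()) if s >= SIM_THRESHOLD]
        expanded[label] = list(dict.fromkeys(seeds + extra))  # de-duplicate, keep order
    return expanded

# Illustrative usage with hypothetical seed words:
# verbalizer = {"体育": ["体育", "运动"], "财经": ["财经", "金融"]}
# expanded = expand_verbalizer(train_texts, verbalizer)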

Keyword:

zero-shot learning; few-shot learning; text classification; Chinese; prompt

Author Community:

  • [1] Wu D., School of Computer Science, Beijing University of Technology, Beijing, China
  • [2] Zheng S., School of Computer Science, Beijing University of Technology, Beijing, China
  • [3] Wang Q., School of Computer Science, Beijing University of Technology, Beijing, China

Reprint Author's Address:

Email:


Related Keywords:

Related Article:

Source:

ISSN: 0277-786X

Year: 2023

Volume: 12645

Language: English

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:
