
Author:

Liu, Jiangbo (Liu, Jiangbo.) | He, Dongzhi (He, Dongzhi.)

Indexed by:

CPCI-S

Abstract:

In traditional review-text classification, achieving a highly accurate classification model rests on two basic premises: (1) the training data and test data must be independently and identically distributed, and (2) there must be enough labeled training data to learn a good classification model. In many cases, however, neither premise holds. If a model already exists that classifies data from one domain well, and a classification task arises in a related domain for which only source-domain data are available, this assumption is violated. Review-text classification based on transfer learning applies the classification knowledge learned in the source domain to the new classification task in the related domain. Accordingly, after constructing a homogeneous feature space for the source and target domains, the TrAdaBoost transfer-learning framework is used to train the classification model. This model allows users to leverage old data together with a small amount of newly labeled data to build a high-quality classification model for the new data. Experimental results show that the model can effectively transfer classification knowledge from the source domain to the target domain.
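The TrAdaBoost framework mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it is a simplified version of the standard TrAdaBoost idea (decision stumps as the weak learner, synthetic data, and all parameter choices here are assumptions for illustration): at each boosting round, source-domain samples that the weak learner misclassifies have their weights shrunk, while misclassified target-domain samples have their weights boosted as in AdaBoost, so the ensemble gradually focuses on source samples that remain useful for the target task.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def tradaboost(X_src, y_src, X_tgt, y_tgt, n_iters=10):
    """Simplified TrAdaBoost sketch for binary labels in {0, 1}.

    Source weights shrink on error (fixed rate beta_src);
    target weights grow on error (AdaBoost-style rate beta_t).
    """
    n_src = len(X_src)
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    w = np.ones(len(X)) / len(X)
    # fixed down-weighting rate for misclassified source samples
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n_src) / n_iters))
    learners, betas = [], []
    for _ in range(n_iters):
        p = w / w.sum()
        clf = DecisionTreeClassifier(max_depth=1)  # decision stump
        clf.fit(X, y, sample_weight=p)
        pred = clf.predict(X)
        # weighted error measured on the target domain only
        err_tgt = np.sum(p[n_src:] * (pred[n_src:] != y_tgt)) / p[n_src:].sum()
        err_tgt = np.clip(err_tgt, 1e-3, 0.49)  # keep beta_t in (0, 1)
        beta_t = err_tgt / (1.0 - err_tgt)
        # shrink misclassified source weights, boost misclassified target weights
        w[:n_src] *= beta_src ** (pred[:n_src] != y_src)
        w[n_src:] *= beta_t ** (-(pred[n_src:] != y_tgt).astype(float))
        learners.append(clf)
        betas.append(beta_t)

    def predict(Xq):
        # weighted vote over the second half of the learners,
        # as in the original TrAdaBoost final hypothesis
        half = len(learners) // 2
        votes = np.zeros(len(Xq))
        for clf, b in zip(learners[half:], betas[half:]):
            votes += -np.log(b) * (clf.predict(Xq) * 2 - 1)
        return (votes > 0).astype(int)

    return predict
```

On well-separated synthetic clusters with a small shift between source and target, the returned classifier should recover the target decision boundary from mostly source-domain data plus a handful of target labels.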

Keyword:

text classification; transfer learning

Author Community:

  • [ 1 ] [Liu, Jiangbo]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 2 ] [He, Dongzhi]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China

Reprint Author's Address:

  • [Liu, Jiangbo]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China


Source:

PROCEEDINGS OF 2020 IEEE 5TH INFORMATION TECHNOLOGY AND MECHATRONICS ENGINEERING CONFERENCE (ITOEC 2020)

Year: 2020

Page: 191-195

Language: English

Cited Count:

WoS CC Cited Count: 3

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 4
