
Author:

Liu, Jiangbo | He, Dongzhi

Indexed by:

EI; Scopus

Abstract:

Traditional review-text classification methods rest on two premises for achieving a high-accuracy classification model: (1) the training data and test data must be independently and identically distributed; (2) there must be enough training data to learn a good classification model. In many cases, however, neither premise holds. For example, a classification model may already exist that classifies data from one domain well, while a new classification task arises in a related domain for which only source-domain data are available; this violates the assumption. Review-text classification based on transfer learning applies the classification knowledge learned in the source domain to the new classification task in the related domain. Accordingly, after constructing an isomorphic feature space for the source and target domains, the TrAdaBoost transfer learning framework was used to train the classification model. This model allows users to leverage old data, together with a small amount of newly labeled data, to build a high-quality classification model for the new data. Experimental results show that the model effectively transfers classification knowledge from the source domain to the target domain. © 2020 IEEE.
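The instance-reweighting idea behind the TrAdaBoost framework the abstract mentions can be sketched as follows. This is an illustrative re-implementation of the general algorithm (Dai et al., 2007), not the authors' code: the decision-stump weak learner, the round count, and the synthetic shifted-domain data in the usage example are all assumptions made for the sketch.

```python
import numpy as np

def train_stump(X, y, w):
    """Weak learner: pick the (feature, threshold, polarity) decision stump
    that minimizes the weighted 0/1 error on the combined data."""
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = (pol * (X[:, f] - thr) >= 0).astype(int)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best_err, best = err, (f, thr, pol)
    return best

def stump_predict(stump, X):
    f, thr, pol = stump
    return (pol * (X[:, f] - thr) >= 0).astype(int)

def tradaboost(Xs, ys, Xt, yt, n_rounds=10):
    """Boost on source+target data, measuring error on the target portion only.
    Misclassified source instances are down-weighted (they look unlike the
    target distribution); misclassified target instances are up-weighted,
    as in ordinary AdaBoost."""
    n = len(Xs)
    X = np.vstack([Xs, Xt])
    y = np.concatenate([ys, yt])
    w = np.ones(len(X))
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n) / n_rounds))
    stumps, betas = [], []
    for _ in range(n_rounds):
        p = w / w.sum()
        stump = train_stump(X, y, p)
        miss = (stump_predict(stump, X) != y).astype(float)
        eps = np.sum(p[n:] * miss[n:]) / np.sum(p[n:])  # target-only weighted error
        eps = min(max(eps, 1e-3), 0.49)                 # keep beta_t inside (0, 1)
        beta_t = eps / (1.0 - eps)
        w[:n] *= beta_src ** miss[:n]    # shrink weights of misfit source instances
        w[n:] *= beta_t ** (-miss[n:])   # grow weights of misfit target instances
        stumps.append(stump)
        betas.append(beta_t)
    return stumps, betas

def tradaboost_predict(stumps, betas, X):
    """Weighted vote over the second half of the rounds, as in the original paper."""
    start = len(stumps) // 2
    score, total = np.zeros(len(X)), 0.0
    for stump, b in zip(stumps[start:], betas[start:]):
        score += -np.log(b) * stump_predict(stump, X)
        total += -np.log(b)
    return (score >= 0.5 * total).astype(int)
```

A minimal usage example on a hypothetical domain shift: abundant source data labeled by the rule x > 0.5 and a small target set labeled by x > 0, where the boosting loop progressively shifts weight toward the target instances.

```python
rng = np.random.default_rng(0)
Xs = rng.normal(0.5, 1.0, (100, 1)); ys = (Xs[:, 0] > 0.5).astype(int)
Xt = rng.normal(0.0, 1.0, (20, 1));  yt = (Xt[:, 0] > 0.0).astype(int)
stumps, betas = tradaboost(Xs, ys, Xt, yt, n_rounds=10)
pred = tradaboost_predict(stumps, betas, Xt)
```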

Keyword:

Text processing; Classification (of information); Learning systems; Transfer learning

Author Community:

  • [ 1 ] [Liu, Jiangbo]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 2 ] [He, Dongzhi]Faculty of Information Technology, Beijing University of Technology, Beijing, China


Year: 2020

Page: 191-195

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 3

ESI Highly Cited Papers on the List: 0

