Abstract:
Collaborative filtering usually suffers from limited performance due to data sparsity. Transfer learning offers an opportunity to alleviate this issue by transferring useful knowledge from an auxiliary domain to a target domain. However, the situation becomes complicated when the source and target domains share only part of their knowledge: transferring the unshared part across domains causes negative transfer and may degrade prediction accuracy in the target domain. To address this issue, we present a novel model that exploits latent factors in the target domain to counteract negative transfer. First, we transfer rating patterns from the source domain to approximate and reconstruct the target rating matrix. Second, we propose a partial-adaptation nonnegative matrix factorization method to correct the transfer-learning result and restore the latent factors in the target domain. Experiments on real-world datasets demonstrate that our approach effectively mitigates negative transfer and significantly outperforms the state-of-the-art transfer-learning model.
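For context, the sketch below shows a minimal masked nonnegative matrix factorization of a sparse rating matrix, the in-domain building block the abstract refers to. It is only an illustrative example under assumed names (R, mask, W, H, masked_nmf); it is not the paper's partial-adaptation model and omits the cross-domain rating-pattern transfer and correction steps.

```python
import numpy as np

def masked_nmf(R, mask, k=8, iters=200, eps=1e-9, seed=0):
    """Minimal masked NMF: approximate R ~ W @ H on observed entries only.

    R    : (m, n) nonnegative rating matrix (zeros where unobserved)
    mask : (m, n) binary matrix, 1 where a rating is observed
    k    : number of latent factors
    """
    rng = np.random.default_rng(seed)
    m, n = R.shape
    W = rng.random((m, k))
    H = rng.random((k, n))
    for _ in range(iters):
        # Multiplicative updates restricted to observed entries via the mask.
        WH = W @ H
        W *= ((mask * R) @ H.T) / (((mask * WH) @ H.T) + eps)
        WH = W @ H
        H *= (W.T @ (mask * R)) / ((W.T @ (mask * WH)) + eps)
    return W, H

# Toy usage: a 5x4 rating matrix with missing entries marked by 0.
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [1, 0, 0, 4],
              [0, 1, 5, 4]], dtype=float)
mask = (R > 0).astype(float)
W, H = masked_nmf(R, mask, k=2)
print(np.round(W @ H, 2))  # dense reconstruction used to fill missing ratings
```

In the paper's setting, the target-side factors would additionally be corrected using rating patterns transferred from the source domain; the sketch covers only the in-domain factorization.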
Source:
CCF TRANSACTIONS ON PERVASIVE COMPUTING AND INTERACTION
ISSN: 2524-521X
Year: 2020
Issue: 1
Volume: 2
Page: 42-50
SCOPUS Cited Count: 1
ESI Highly Cited Papers on the List: 0