Abstract:
Previous works model user interest by mining interactions deeply. However, interactions are incomplete and insufficient to support interest modeling, and can even introduce severe bias into recommendations. To address interaction sparsity and the consequent bias, we propose graph contrastive learning on complementary embedding (GCCE), which introduces negative interests to complement the positive interests observed in interactions. To embed interests, we design a perturbed graph convolution that prevents the embedding distribution from becoming biased. Since negative samples are unavailable in the general implicit-feedback scenario, we devise a complementary embedding generation scheme to depict users' negative interests. Finally, we develop a new contrastive task that learns from positive and negative interests jointly to improve recommendation. We validate the effectiveness of GCCE on two real-world datasets, where it outperforms state-of-the-art recommendation models.
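The abstract describes contrasting a user's positive-interest embedding (produced by a perturbed graph convolution) against a generated negative-interest ("complementary") embedding. The sketch below illustrates one plausible form of such an objective — a pairwise InfoNCE-style loss over a perturbed view and a complementary embedding. The perturbation (`perturb`), the loss form, and all names are assumptions for illustration; the abstract does not specify GCCE's actual formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2norm(x):
    """Row-wise L2 normalization of embeddings."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def perturb(emb, eps=0.1):
    """Add a small random direction to each embedding row.
    A generic stand-in for the paper's 'perturbed graph convolution',
    whose exact form the abstract does not give."""
    noise = rng.normal(size=emb.shape)
    return emb + eps * l2norm(noise)

def contrastive_loss(pos, neg, tau=0.2):
    """Pull a user's positive-interest embedding toward its perturbed
    view, push it away from the complementary (negative-interest)
    embedding. Hypothetical pairwise softplus/InfoNCE-style form."""
    p1 = l2norm(pos)
    p2 = l2norm(perturb(pos))
    n = l2norm(neg)
    s_pos = np.sum(p1 * p2, axis=1) / tau  # agreement with perturbed view
    s_neg = np.sum(p1 * n, axis=1) / tau   # agreement with negative interest
    # -log sigmoid(s_pos - s_neg), averaged over users
    return float(np.mean(np.log1p(np.exp(s_neg - s_pos))))
```

As expected of such a loss, it shrinks when the complementary embedding points away from the positive one and grows when the two coincide.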
Source:
PROCEEDINGS OF THE 2023 ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2023
Year: 2023
Page: 576-580