
Author:

Xu, Shaowu | Wang, Luo | Jia, Xibin (贾熹滨)

Indexed by:

EI; Scopus; SCIE

Abstract:

Studies on graph contrastive learning, an effective form of self-supervision, have achieved excellent experimental performance. Most existing methods generate two augmented views and then learn features on the two views by maximizing semantic consistency between them. Nevertheless, it remains challenging to generate optimal views that allow graph contrastive learning to reveal the essential association relations among nodes. Since extremely high mutual information between views tends to harm model training, a good choice is to add constraints to the graph data augmentation process. This paper proposes two constraint principles, low dissimilarity priority (LDP) and mutual exclusion (ME), to mitigate the mutual information between views and compress its redundant parts. The LDP principle reduces the mutual information between views at a global scale, while the ME principle reduces it at a local scale; the two are opposite in character and appropriate in different situations. Without loss of generality, the two proposed principles are applied to two well-performing graph contrastive methods, GraphCL and GCA, and experimental results on 20 public benchmark datasets show that models aided by the two proposed constraint principles achieve higher recognition accuracy.
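The pipeline the abstract describes — generate two augmented views of a graph, then maximize agreement between their node embeddings — can be sketched in miniature. The following is an illustrative NumPy sketch using a generic edge-drop augmentation and an NT-Xent (InfoNCE) contrastive loss; it is not the paper's LDP/ME-constrained method, and all function names here are hypothetical.

```python
import numpy as np

def drop_edges(adj, drop_rate, rng):
    """Randomly remove a fraction of edges from an undirected adjacency
    matrix -- a common generic graph augmentation (illustrative only;
    the paper constrains this step with the LDP/ME principles)."""
    keep = rng.random(adj.shape) >= drop_rate
    keep = np.triu(keep, 1)
    keep = keep | keep.T                 # keep the graph symmetric
    return adj * keep

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent / InfoNCE loss between node embeddings of two views.
    Row i of z1 and row i of z2 are treated as a positive pair."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                               # cosine similarities
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                  # positives on diagonal
```

In a full model, `z1` and `z2` would come from a shared GNN encoder applied to the two augmented views; minimizing `nt_xent` pulls matching nodes together across views while pushing non-matching nodes apart.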

Keyword:

Augmentation principle; Self-supervised learning; Graph data augmentation; Contrastive learning; Graph representation learning

Author Community:

  • [ 1 ] [Xu, Shaowu]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Wang, Luo]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Jia, Xibin]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China

Reprint Author's Address:


Related Keywords:

Source :

NEURAL PROCESSING LETTERS

ISSN: 1370-4621

Year: 2023

Issue: 8

Volume: 55

Page: 10705-10726

3.100

JCR@2022

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 19

Cited Count:

WoS CC Cited Count: 2

SCOPUS Cited Count: 2

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:
