Author:

Zhang, Q. | Sun, Y. | Hu, Y. | Wang, S. | Yin, B.

Indexed by:

EI; Scopus; SCIE

Abstract:

Graph Convolutional Network (GCN) is a powerful model for graph representation learning. Because GCN updates nodes with a recursive neighbor aggregation scheme, training it on large-scale graphs incurs enormous computational cost and large memory requirements. Subgraph sampling methods speed up training by running GCN on sampled small-scale subgraphs. However, they suffer from problems such as training GCN on unconnected and scale-unbalanced subgraphs, which reduces performance and efficiency. Moreover, existing subgraph sampling methods train GCN on subgraphs independently and ignore the relational information among different subgraphs. This paper proposes a novel subgraph sampling method, Improved Adaptive Neighbor Sampling (IANS), and a novel loss function, the Subgraph Contrastive Loss. Subgraphs sampled by IANS are scale-balanced, their internal nodes are highly relevant to one another, and the sample ratio controls the sparsity of the subgraphs. To recover the relational information lost between different subgraphs, the Subgraph Contrastive Loss constrains nodes that were originally connected but fall in different subgraphs to be closer in feature space, while pushing unconnected nodes farther apart. A series of experiments trains GCN with IANS and the Subgraph Contrastive Loss for node classification on three datasets of different scales. The training time and classification accuracy demonstrate the effectiveness of the proposed method. © 2023 Elsevier Inc.
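
For readers unfamiliar with cross-subgraph contrastive objectives, the sketch below illustrates the general idea described in the abstract: node pairs that were connected in the original graph but sampled into different subgraphs are pulled together in feature space, while unconnected cross-subgraph pairs are pushed apart. This is a minimal, generic PyTorch sketch; the function name, signature, temperature parameter, and softplus-based form are illustrative assumptions, not the paper's exact IANS or Subgraph Contrastive Loss formulation.

    # Hedged sketch of a cross-subgraph contrastive loss (not the paper's exact loss).
    import torch
    import torch.nn.functional as F

    def subgraph_contrastive_loss(z_a, z_b, pos_pairs, neg_pairs, temperature=0.5):
        """z_a, z_b: node embeddings from two sampled subgraphs (N_a x d, N_b x d).
        pos_pairs: LongTensor of shape (P, 2); (i, j) pairs that were connected in the
                   original graph but ended up in different subgraphs.
        neg_pairs: LongTensor of shape (Q, 2); unconnected cross-subgraph pairs.
        Returns a scalar loss that pulls positive pairs together and pushes
        negative pairs apart in the normalized feature space."""
        z_a = F.normalize(z_a, dim=1)
        z_b = F.normalize(z_b, dim=1)
        pos_sim = (z_a[pos_pairs[:, 0]] * z_b[pos_pairs[:, 1]]).sum(dim=1) / temperature
        neg_sim = (z_a[neg_pairs[:, 0]] * z_b[neg_pairs[:, 1]]).sum(dim=1) / temperature
        # softplus(-x) decreases as similarity grows (attract positives);
        # softplus(x) decreases as similarity shrinks (repel negatives).
        return F.softplus(-pos_sim).mean() + F.softplus(neg_sim).mean()

    if __name__ == "__main__":
        # Random embeddings and index pairs, for shape illustration only.
        torch.manual_seed(0)
        z_a, z_b = torch.randn(128, 64), torch.randn(96, 64)
        pos = torch.stack([torch.randint(0, 128, (32,)), torch.randint(0, 96, (32,))], dim=1)
        neg = torch.stack([torch.randint(0, 128, (256,)), torch.randint(0, 96, (256,))], dim=1)
        print(subgraph_contrastive_loss(z_a, z_b, pos, neg).item())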

Keyword:

Subgraph contrastive loss; Large-scale graph; Node classification; Graph convolutional network; Subgraph sampling

Author Community:

  • [ 1 ] [Zhang Q.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Faculty of Information Technology, Beijing University of Technology, Beijing, 100124, China
  • [ 2 ] [Sun Y.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Faculty of Information Technology, Beijing University of Technology, Beijing, 100124, China
  • [ 3 ] [Hu Y.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Faculty of Information Technology, Beijing University of Technology, Beijing, 100124, China
  • [ 4 ] [Wang S.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Faculty of Information Technology, Beijing University of Technology, Beijing, 100124, China
  • [ 5 ] [Yin B.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Faculty of Information Technology, Beijing University of Technology, Beijing, 100124, China

Source:

Information Sciences

ISSN: 0020-0255

Year: 2023

Volume: 649

Impact Factor: 8.100 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 19

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 3

ESI Highly Cited Papers on the List: 0

30 Days PV: 4
