
Author:

He, Ming | Ding, Tianyu | Han, Tianshuo

Indexed by:

CPCI-S; EI; Scopus

Abstract:

Graph Convolutional Networks (GCNs) have recently achieved impressive performance in different classification tasks. However, over-smoothing remains a fundamental obstacle to building deep GCNs for node classification. This paper proposes Structure-Aware Deep Graph Convolutional Networks (SAGCN), a novel model to overcome this obstacle. At its core, SAGCN separates the initial node features from propagation and directly maps them to the output at each layer. Furthermore, SAGCN selectively aggregates the information from different propagation layers to generate structure-aware node representations, where an attention mechanism is exploited to adaptively balance the information from local and global neighborhoods for each node. Our experiments verify that the SAGCN model achieves state-of-the-art performance in various semi-supervised and fully supervised node classification tasks. More importantly, it outperforms many other backbone models while using half as many layers, or even fewer.
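The record contains no code, but the aggregation idea described in the abstract can be illustrated. The sketch below is not the authors' implementation: the propagation rule (parameter-free normalized adjacency powers), the layer count, and all names (SAGCNSketch, input_map, attn) are assumptions made only for illustration of the layer-wise attention over local and global neighborhood views.

```python
# Minimal sketch of the idea in the abstract (assumptions, not the authors' code):
# keep the initial features out of propagation, collect one representation per
# propagation depth, and let a per-node attention score balance shallow (local)
# against deep (global) views before classification.
import torch
import torch.nn as nn
import torch.nn.functional as F


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize an adjacency matrix with self-loops: D^-1/2 (A+I) D^-1/2."""
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = torch.diag(a_hat.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt


class SAGCNSketch(nn.Module):
    def __init__(self, in_dim: int, hid_dim: int, n_classes: int, n_layers: int = 8):
        super().__init__()
        self.n_layers = n_layers
        self.input_map = nn.Linear(in_dim, hid_dim)   # raw features mapped directly, outside propagation
        self.attn = nn.Linear(hid_dim, 1)             # scores each propagation depth, per node
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        h0 = self.input_map(x)
        reps = [h0]
        h = h0
        for _ in range(self.n_layers):
            h = adj_norm @ h                          # parameter-free propagation step (assumption)
            reps.append(h)
        stack = torch.stack(reps, dim=1)              # (N, L+1, hid): local -> global views
        scores = F.softmax(self.attn(stack), dim=1)   # per-node attention over depths
        out = (scores * stack).sum(dim=1)             # structure-aware node representation
        return self.classifier(out)


if __name__ == "__main__":
    n, f, c = 5, 16, 3
    adj = (torch.rand(n, n) > 0.5).float()
    adj = ((adj + adj.t()) > 0).float()               # toy symmetric graph
    model = SAGCNSketch(f, 32, c)
    logits = model(torch.randn(n, f), normalize_adj(adj))
    print(logits.shape)                               # torch.Size([5, 3])
```

Because propagation here carries no learnable weights, depth mainly widens the receptive field, and the attention weights decide per node how much of each depth to use, which is one plausible reading of how such a model could match deeper backbones with fewer layers.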

Keyword:

Deep learning; Attention mechanism; Node classification; Graph Convolutional Networks

Author Community:

  • [ 1 ] [He, Ming]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 2 ] [Ding, Tianyu]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 3 ] [Han, Tianshuo]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China


Source :

ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2021, PT II

ISSN: 0302-9743

Year: 2021

Volume: 12713

Page: 67-78

Cited Count:

WoS CC Cited Count: 0

ESI Highly Cited Papers on the List: 0

