Abstract:
Graph neural networks (GNNs) exhibit a strong capability for representation learning on graphs with complex structures, demonstrating superior performance across various applications. Most existing GNNs employ graph convolution operations that integrate attribute and structural information in a coupled way. From an optimization perspective, these GNNs seek a consensus, compromised embedding representation that balances attribute and graph information, selectively exploring and retaining only part of the valid information. To obtain a more comprehensive embedding representation, a novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced. DGNN separately explores distinctive embedding representations from the attribute and graph spaces via decoupled terms. Since the semantic graph derived from the attribute feature space contains different node connection information and complements the topological graph, DGNN integrates both topological and semantic graphs for powerful embedding learning. Furthermore, structural consistency between the attribute embedding and the graph embedding is promoted to eliminate redundant information and establish a soft connection; this is achieved by sharing factors in the reconstruction of the adjacency matrices, which explores consensus and high-level correlations. Finally, a more powerful and comprehensive representation is obtained by concatenating these embeddings. Experimental results on several graph benchmark datasets demonstrate its superiority in node classification tasks. © 2015 IEEE.
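The abstract describes three ingredients: embeddings learned separately from the attribute space and the graph space, a semantic graph built from node attributes to complement the topological graph, and a final representation formed by concatenation. The paper's actual objective and factor-sharing scheme are not given here, so the sketch below is only an illustrative forward pass under assumed design choices: `tanh`-activated linear maps, standard GCN-style symmetric normalization, and a cosine-similarity kNN graph as the "semantic" graph. All function names and weight shapes are hypothetical.

```python
import numpy as np

def normalize_adj(A):
    # Symmetrically normalize an adjacency matrix with self-loops:
    # D^{-1/2} (A + I) D^{-1/2}, the standard GCN propagation operator.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def knn_semantic_graph(X, k=2):
    # Assumed construction: a kNN graph in attribute space under cosine
    # similarity, one common way to derive a "semantic" graph from features.
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    S = Xn @ Xn.T
    np.fill_diagonal(S, -np.inf)       # exclude self-similarity
    A = np.zeros_like(S)
    for i in range(S.shape[0]):
        A[i, np.argsort(S[i])[-k:]] = 1.0
    return np.maximum(A, A.T)          # symmetrize

def dgnn_forward(X, A_topo, W_attr, W_topo, W_sem):
    # Decoupled embeddings: one from attributes alone, one propagated over
    # the topological graph, one over the attribute-derived semantic graph.
    Z_attr = np.tanh(X @ W_attr)
    Z_topo = np.tanh(normalize_adj(A_topo) @ X @ W_topo)
    Z_sem = np.tanh(normalize_adj(knn_semantic_graph(X)) @ X @ W_sem)
    # Final representation: concatenation of the three embeddings.
    return np.concatenate([Z_attr, Z_topo, Z_sem], axis=1)
```

Note that this omits the paper's structural-consistency term (shared factors for adjacency reconstruction), which would couple the branches during training rather than in the forward pass.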
Source:
IEEE Transactions on Big Data
ISSN: 2332-7790
Year: 2024
Impact Factor: 7.200 (JCR@2022)
ESI Highly Cited Papers on the List: 0