
Author:

Tian, T. | He, X. | Wang, B. | Li, X. | Hu, Y.

Indexed by:

EI Scopus

Abstract:

Graph clustering aims to divide the graph nodes into distinct clusters in an unsupervised manner, usually by encoding the data features and the corresponding graph structure into a latent feature space. However, we observe that the node representations learned by existing graph embedding methods are often of poor quality and may contain noise, e.g., redundant or deficient information, which significantly degrades clustering performance. To address this problem, we propose a novel Information Correlation Co-Supervision based Graph Convolutional Clustering Model. The model consists of a hierarchical deep network architecture, i.e., a graph auto-encoder and an auto-encoder. Specifically, the graph convolutional auto-encoder (GCAE) is used to integrate the graph structure information into the latent feature space, and the auto-encoder (AE) is employed to learn node embeddings. To make the node representations more discriminative and more robust against noise, an information correlation co-supervision loss is designed to supervise the correlation between the node representations learned by the AE and the GCAE. Extensive experiments on three public datasets demonstrate that the proposed model outperforms multiple state-of-the-art deep clustering methods. © 2023 Technical Committee on Control Theory, Chinese Association of Automation.
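The abstract only names the components (an AE on node features, a GCAE on features plus graph structure, and a correlation co-supervision term between their latent codes); it does not give layer sizes or the exact loss. The following is a minimal PyTorch sketch of that dual-encoder idea, where the cosine-agreement term used as the co-supervision loss and all dimensions are illustrative assumptions, not the paper's actual formulation.

```python
# Hedged sketch of an AE + GCAE pair with a correlation co-supervision term.
# All module names, dimensions, and the exact loss form are assumptions for
# illustration; the paper's abstract does not specify them.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AutoEncoder(nn.Module):
    """Plain auto-encoder (AE) that learns node embeddings from features."""
    def __init__(self, in_dim, hid_dim, lat_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU(),
                                     nn.Linear(hid_dim, lat_dim))
        self.decoder = nn.Sequential(nn.Linear(lat_dim, hid_dim), nn.ReLU(),
                                     nn.Linear(hid_dim, in_dim))

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)


class GCNLayer(nn.Module):
    """One graph convolution: A_hat @ X @ W (dense adjacency for brevity)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, a_hat):
        return a_hat @ self.lin(x)


class GCAE(nn.Module):
    """Graph convolutional auto-encoder: GCN encoder + inner-product decoder."""
    def __init__(self, in_dim, hid_dim, lat_dim):
        super().__init__()
        self.gc1 = GCNLayer(in_dim, hid_dim)
        self.gc2 = GCNLayer(hid_dim, lat_dim)

    def forward(self, x, a_hat):
        h = F.relu(self.gc1(x, a_hat))
        z = self.gc2(h, a_hat)
        a_rec = torch.sigmoid(z @ z.t())   # reconstruct adjacency
        return z, a_rec


def co_supervision_loss(z_ae, z_gcae):
    """Illustrative correlation co-supervision: push the two latent
    representations to agree (assumed form, not the paper's exact loss)."""
    z1 = F.normalize(z_ae, dim=1)
    z2 = F.normalize(z_gcae, dim=1)
    return 1.0 - (z1 * z2).sum(dim=1).mean()


# Toy usage with random data (shapes only, no real dataset).
n, d = 100, 32
x = torch.randn(n, d)
a = (torch.rand(n, n) < 0.05).float()
a_hat = a + torch.eye(n)                                        # self-loops
deg = a_hat.sum(1)
a_hat = a_hat / deg.sqrt().unsqueeze(1) / deg.sqrt().unsqueeze(0)  # sym. norm

ae, gcae = AutoEncoder(d, 64, 16), GCAE(d, 64, 16)
z_ae, x_rec = ae(x)
z_gc, a_rec = gcae(x, a_hat)
loss = (F.mse_loss(x_rec, x)                                   # AE reconstruction
        + F.binary_cross_entropy(a_rec, (a_hat > 0).float())   # graph reconstruction
        + co_supervision_loss(z_ae, z_gc))                     # co-supervision term
loss.backward()
```

In practice, the latent codes from such a pair would be fed to a clustering head (e.g., k-means or a self-training clustering layer) to produce the final node clusters; that step is omitted here.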

Keyword:

Clustering; Graph convolutional network; Representation learning

Author Community:

  • [ 1 ] [Tian T.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing, 100124, China
  • [ 2 ] [Tian T.]Beijing Artificial Intelligence Institute, Beijing University of Technology, Faculty of Information Technology, Beijing, 100124, China
  • [ 3 ] [He X.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing, 100124, China
  • [ 4 ] [He X.]Beijing Artificial Intelligence Institute, Beijing University of Technology, Faculty of Information Technology, Beijing, 100124, China
  • [ 5 ] [Wang B.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing, 100124, China
  • [ 6 ] [Wang B.]Beijing Artificial Intelligence Institute, Beijing University of Technology, Faculty of Information Technology, Beijing, 100124, China
  • [ 7 ] [Li X.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing, 100124, China
  • [ 8 ] [Li X.]Beijing Artificial Intelligence Institute, Beijing University of Technology, Faculty of Information Technology, Beijing, 100124, China
  • [ 9 ] [Hu Y.]Beijing Key Laboratory of Multimedia and Intelligent Software Technology, Beijing, 100124, China
  • [ 10 ] [Hu Y.]Beijing Artificial Intelligence Institute, Beijing University of Technology, Faculty of Information Technology, Beijing, 100124, China

Reprint Author's Address:

Email:


Related Keywords:

Source:

ISSN: 1934-1768

Year: 2023

Volume: 2023-July

Page: 7657-7662

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 2

Affiliated Colleges:
