
Author:

Zhang, Jinli | Jiang, Zongli | Li, Chen | Wang, Zhenbo

Indexed by:

EI

Abstract:

Attention-based models have attracted considerable interest in both natural language processing and graph processing. We propose a novel model called Graph Encoder Representations from Transformers (GERT). Inspired by the similar distributions of vertices in graphs and words in natural language, GERT uses sentence-like vertex sequences obtained from truncated random walks to learn the local information of vertices. GERT then combines the strengths of local information learned from random walks with the long-distance dependencies captured by transformer encoder models to represent latent features. Compared to other transformer models, the advantages of GERT include extracting both local and global information, being suitable for both homogeneous and heterogeneous networks, and having a greater capacity for extracting latent features. On top of GERT, we integrate convolution to extract information from local neighbors, obtaining another novel model, Graph Convolution Encoder Representations from Transformers (GCERT). We demonstrate the effectiveness of the proposed models on six networks: DBLP, BlogCatalog, CiteSeerX, CoRE, Flickr, and PubMed. Evaluation results show that our models improve the F1 scores of current state-of-the-art methods by up to 10%. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2023.
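The abstract describes generating sentence-like vertex sequences via truncated random walks, which are then fed to a transformer encoder. The record includes no code; as a rough sketch of that first step only (the function name, parameters, and toy graph below are hypothetical, not from the paper), truncated random walks over an adjacency list might look like:

```python
import random

def truncated_random_walks(adj, walk_length=6, walks_per_vertex=2, seed=0):
    """Generate truncated random walks over a graph.

    Each walk is a sequence of vertices, analogous to a sentence of
    words in natural language, and can be fed to a sequence encoder.
    """
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_vertex):
            walk = [start]
            while len(walk) < walk_length:
                neighbors = adj[walk[-1]]
                if not neighbors:
                    break  # dead end: truncate the walk early
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

# Toy undirected graph as an adjacency list (illustrative only).
graph = {
    "A": ["B", "C"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["C"],
}
walks = truncated_random_walks(graph)
```

In the GERT setting, these walk sequences would play the role that tokenized sentences play in language models, supplying the local context that the transformer encoder combines with long-distance dependencies.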

Keyword:

Convolution; Graph neural networks; Network embeddings; Network coding; Random processes; Heterogeneous networks; Supervised learning; Natural language processing systems

Author Community:

  • [ 1 ] [Zhang, Jinli]Beijing University of Technology, Pingleyuan Street, Beijing, China
  • [ 2 ] [Jiang, Zongli]Beijing University of Technology, Pingleyuan Street, Beijing, China
  • [ 3 ] [Li, Chen]Graduate School of Informatics, Nagoya University, Chikusa, Nagoya 464-8602, Japan
  • [ 4 ] [Wang, Zhenbo]Beijing University of Technology, Pingleyuan Street, Beijing, China

Source:

ISSN: 0302-9743

Year: 2023

Volume: 14178 LNAI

Page: 199-213

Language: English

Cited Count:

WoS CC Cited Count: 0

ESI Highly Cited Papers on the List: 0
