
Author:

Zhu, Ning | Wang, Shaofan | Sun, Yanfeng | Yin, Baocai

Indexed by:

SCIE

Abstract:

The graph few-shot class incremental learning (GFSCIL) task incrementally acquires new knowledge from non-stationary data streams with limited training samples. It faces two distinct problems: overfitting, where the network aligns too closely with the characteristics of the few-shot samples and fails to capture the underlying class patterns; and catastrophic forgetting, where newly introduced information gradually interferes with previously learned knowledge. Prototype-based methods tackle these challenges by clustering in a metric space to obtain class prototypes, in contrast to traditional class-incremental learning methods that modify model parameters or add regularization constraints. We propose the uncertainty-guided recurrent prototype distillation network (URPD) for GFSCIL to address these challenges. URPD comprises two key components: a recurrent prototype representation (RPR) module and a generative distillation (GD) module. The RPR module tackles overfitting by generating recurrent class prototypes based on an uncertainty selection scheme applied to unlabeled nodes. The GD module mitigates catastrophic forgetting through a generative distillation scheme that distills old knowledge from both current nodes and generated replay nodes. In essence, URPD improves traditional prototype-based methods by learning debiased class prototypes enriched with knowledge induced from unlabeled nodes. Experimental results demonstrate that URPD outperforms current state-of-the-art methods by margins of 0.95% to 6.46% on the Cora-Full, Cora-ML, Flickr, and Amazon datasets.
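The prototype-based scheme the abstract builds on — class prototypes obtained as centroids in a metric space, with queries classified by their nearest prototype — can be sketched as follows. This is a minimal illustration of the general technique (in the style of prototypical networks), not the authors' URPD implementation; the function names and the toy 2-D embeddings are assumptions for the example.

```python
import math

def class_prototypes(embeddings, labels):
    """One prototype per class: the mean of that class's support embeddings."""
    by_class = {}
    for vec, lab in zip(embeddings, labels):
        by_class.setdefault(lab, []).append(vec)
    return {lab: [sum(dim) / len(vecs) for dim in zip(*vecs)]
            for lab, vecs in by_class.items()}

def nearest_prototype(query, protos):
    """Classify a query embedding by its nearest prototype (Euclidean distance)."""
    return min(protos, key=lambda lab: math.dist(query, protos[lab]))

# Toy few-shot support set: two well-separated classes in a 2-D metric space
support = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
labels = [0, 0, 1, 1]
protos = class_prototypes(support, labels)

print(nearest_prototype((0.1, 0.0), protos))  # 0
print(nearest_prototype((4.8, 5.2), protos))  # 1
```

Because classification depends only on prototype positions rather than on classifier weights, adding a new class amounts to adding one more prototype, which is why this family of methods sidesteps the parameter-rewriting that drives catastrophic forgetting.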

Keyword:

Class-incremental learning; Few-shot learning; Node classification; Graph neural network

Author Community:

  • [ 1 ] [Zhu, Ning]Beijing Univ Technol, Sch Informat Sci & Technol, 100 Pingyuan Pk, Beijing 100124, Peoples R China
  • [ 2 ] [Wang, Shaofan]Beijing Univ Technol, Sch Informat Sci & Technol, 100 Pingyuan Pk, Beijing 100124, Peoples R China
  • [ 3 ] [Sun, Yanfeng]Beijing Univ Technol, Sch Informat Sci & Technol, 100 Pingyuan Pk, Beijing 100124, Peoples R China
  • [ 4 ] [Yin, Baocai]Beijing Univ Technol, Sch Informat Sci & Technol, 100 Pingyuan Pk, Beijing 100124, Peoples R China

Reprint Author's Address:

  • [Wang, Shaofan]Beijing Univ Technol, Sch Informat Sci & Technol, 100 Pingyuan Pk, Beijing 100124, Peoples R China


Source:

MULTIMEDIA SYSTEMS

ISSN: 0942-4962

Year: 2025

Issue: 3

Volume: 31

Impact Factor: 3.900 (JCR@2022)

ESI Highly Cited Papers on the List: 0
