
Author:

Hu, Yongli | Liu, Mengting | Jiang, Huajie | Feng, Lincong | Yin, Baocai

Indexed by:

CPCI-S

Abstract:

Incremental learning aims to train a model on a sequence of tasks while preserving previously learned knowledge, yet catastrophic forgetting remains a widely studied problem. To tackle this problem, we design a multi-level knowledge distillation framework (MLKD) that combines coarse-grained and fine-grained distillation to effectively retain past knowledge. For the coarse-grained distillation, we encourage the model to memorize the neighborhood relationships among samples. For the fine-grained distillation, we aim to preserve the activation logits within each sample. Through this multi-level knowledge distillation, we can learn more robust incremental learning models. To assess the efficacy of MLKD, we conduct experiments on two popular incremental learning benchmarks (CIFAR100 and Mini-ImageNet), where our approach achieves strong performance.
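
The abstract describes two distillation levels. The following minimal PyTorch sketch illustrates one plausible way the two terms could be instantiated; the loss formulations, function names, and weighting are assumptions made for illustration, not the authors' published implementation.

```python
# Hypothetical sketch of the two distillation levels described in the abstract
# (assumed formulations; not the authors' released code).
import torch
import torch.nn.functional as F

def fine_grained_kd(student_logits, teacher_logits, T=2.0):
    """Fine-grained level: per-sample logit distillation, assumed here to be
    standard temperature-scaled KL between old-model and new-model outputs."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

def coarse_grained_kd(student_feats, teacher_feats):
    """Coarse-grained level: distill neighborhood relationships by matching
    the pairwise cosine-similarity structure of a batch between the old
    (teacher) and new (student) feature spaces -- an assumed instantiation
    of 'memorizing neighborhood relationships among samples'."""
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    return F.mse_loss(s @ s.t(), t @ t.t())

# Combined objective on a new task (lambda_fine / lambda_coarse are placeholders):
# loss = cross_entropy + lambda_fine * fine_grained_kd(...) + lambda_coarse * coarse_grained_kd(...)
```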

Keyword:

Catastrophic forgetting; Knowledge distillation; Incremental learning

Author Community:

  • [ 1 ] [Hu, Yongli]Beijing Univ Technol, Beijing, Peoples R China
  • [ 2 ] [Liu, Mengting]Beijing Univ Technol, Beijing, Peoples R China
  • [ 3 ] [Jiang, Huajie]Beijing Univ Technol, Beijing, Peoples R China
  • [ 4 ] [Feng, Lincong]Beijing Univ Technol, Beijing, Peoples R China
  • [ 5 ] [Yin, Baocai]Beijing Univ Technol, Beijing, Peoples R China

Reprint Author's Address:

  • [Jiang, Huajie]Beijing Univ Technol, Beijing, Peoples R China

Source:

COMPUTER ANIMATION AND SOCIAL AGENTS, CASA 2024, PT I

ISSN: 1865-0929

Year: 2025

Volume: 2374

Page: 290-305
