
Author:

Han, Jidong | Zhang, Ting | Liu, Zhaoying | Li, Yujian

Indexed by:

EI; Scopus; SCIE

Abstract:

The catastrophic forgetting problem is one of the hotspots in the field of deep learning. At present, storing samples of previous tasks in a fixed-size memory is widely regarded as the most effective way to mitigate it. However, the number of samples that a fixed-size memory can hold is limited: as the number of tasks grows, the number of stored samples per task drops sharply, and it is difficult to balance memory capacity against sample count. To alleviate this, some methods use the fixed-size memory to store dimensionality-reduced images instead, but this raises new problems: 1) the dimensionality-reduced images are of poor quality and differ significantly from the original images, and 2) it is unclear which image dimensionality reduction method to choose. To address these problems, we propose a new method. First, we employ a simple and reliable scheme to bridge the domain gap between dimensionality-reduced images and original images, and we theoretically analyze which image dimensionality reduction method is better. Second, to increase the generalization ability of our method and further mitigate catastrophic forgetting, we utilize a self-supervised image augmentation method and an output-feature similarity loss. Third, we make use of neural kernel mapping support vector machine theory to improve the interpretability of our method. Experimental results demonstrate that the top-1 average accuracy of our method is much higher than that of other methods using the same memory size.
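The core trade-off the abstract describes — a fixed byte budget holds more rehearsal samples if each stored image is dimensionality-reduced first — can be illustrated with a minimal sketch. This is not the authors' implementation: `ReducedRehearsalMemory` and the average-pooling reduction below are illustrative stand-ins (the paper itself analyzes which reduction method is preferable).

```python
import numpy as np

def downscale(img, factor=2):
    """Average-pool an HxWxC image by `factor` — a simple stand-in
    for the image dimensionality reduction discussed in the paper."""
    h, w, c = img.shape
    h2, w2 = h // factor, w // factor
    return img[:h2 * factor, :w2 * factor].reshape(
        h2, factor, w2, factor, c).mean(axis=(1, 3))

class ReducedRehearsalMemory:
    """Fixed byte-budget rehearsal buffer that stores downscaled
    exemplars, so more samples fit than at full resolution."""

    def __init__(self, budget_bytes, factor=2):
        self.budget = budget_bytes
        self.factor = factor
        self.samples = []   # list of (reduced_image, label) pairs
        self.used = 0       # bytes consumed so far

    def add(self, img, label):
        small = downscale(img, self.factor).astype(np.float32)
        if self.used + small.nbytes > self.budget:
            return False    # budget exhausted
        self.samples.append((small, label))
        self.used += small.nbytes
        return True
```

For a 32x32x3 float32 image (12288 bytes), pooling by a factor of 2 yields a 16x16x3 exemplar (3072 bytes), so the same budget holds four times as many samples — at the cost of the quality gap and domain difference that the paper's first contribution addresses.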

Keyword:

Catastrophic forgetting problem; Neural kernel mapping support vector machine; Fixed-size memory; Dimensionality reduction images

Author Community:

  • [ 1 ] [Han, Jidong]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Zhang, Ting]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Liu, Zhaoying]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 4 ] [Li, Yujian]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 5 ] [Li, Yujian]Guilin Univ Elect Technol, Sch Artificial Intelligence, Guilin 541004, Peoples R China

Reprint Author's Address:

  • [Liu, Zhaoying]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [Li, Yujian]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [Li, Yujian]Guilin Univ Elect Technol, Sch Artificial Intelligence, Guilin 541004, Peoples R China

Source:

NEURAL COMPUTING & APPLICATIONS

ISSN: 0941-0643

Year: 2023

Volume: 36

Issue: 6

Page: 2767-2796

Impact Factor: 6.000 (JCR@2022)

ESI Highly Cited Papers on the List: 0

