Abstract:
Artificial neural network (ANN) models have achieved remarkable results in many fields, raising the expectation that they can attain human-like intelligence. However, ANNs still cannot perform continual learning the way humans do; this serious defect is known as the catastrophic forgetting problem. To address it, we propose a novel method, a neural mapping support vector machine based on parameter regularization and knowledge distillation (RD-NMSVM for short). Our model consists of three parts: first, a shared neural network module that extracts features common to different tasks; second, a task-specific module that uses a multi-class support vector machine as the classifier, which is equivalent to using the neural network as the neural kernel mapping of the support vector machine; and third, a parameter regularization and knowledge distillation module that prevents the shared module's parameters from changing drastically while retaining previously learned knowledge. Note that RD-NMSVM does not use samples from previous tasks. Our experiments show that RD-NMSVM has clear advantages in mitigating catastrophic forgetting in ANN models.
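The three-part training objective the abstract describes (a multi-class hinge loss for the SVM head, an L2 penalty that keeps the shared network's parameters close to their previous-task values, and a distillation term that matches the old model's softened outputs) can be sketched as below. This is a minimal illustration of the general technique, not the paper's exact formulation; all function names, hyperparameters, and weightings are assumptions.

```python
import numpy as np

def hinge_loss_multiclass(scores, labels, margin=1.0):
    """Crammer-Singer-style multi-class hinge loss on the SVM head's scores."""
    n = scores.shape[0]
    correct = scores[np.arange(n), labels][:, None]
    margins = np.maximum(0.0, scores - correct + margin)
    margins[np.arange(n), labels] = 0.0  # no penalty for the true class itself
    return margins.sum(axis=1).mean()

def l2_param_penalty(params, old_params, lam=1.0):
    """Penalize large drift of shared-network parameters from previous-task values."""
    return lam * sum(float(((p - q) ** 2).sum()) for p, q in zip(params, old_params))

def _softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def distillation_loss(new_scores, old_scores, T=2.0):
    """Cross-entropy between the old model's softened outputs and the new model's."""
    p_old = _softmax(old_scores, T)
    log_p_new = np.log(_softmax(new_scores, T) + 1e-12)
    return -(p_old * log_p_new).sum(axis=1).mean()

def total_loss(scores, labels, old_scores, params, old_params,
               alpha=0.5, beta=0.5):
    """Combined objective: task loss + parameter regularization + distillation.

    alpha and beta are illustrative trade-off weights, not the paper's values.
    """
    return (hinge_loss_multiclass(scores, labels)
            + alpha * l2_param_penalty(params, old_params)
            + beta * distillation_loss(scores, old_scores))
```

In a training loop, `scores` would come from the current shared network plus the new task's SVM head, `old_scores` from a frozen copy of the model saved before the new task, so no stored samples from previous tasks are needed.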
Source:
INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS
ISSN: 1868-8071
Year: 2022
Issue: 9
Volume: 13
Page: 2785-2798
Impact Factor: 5.600 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold:46
JCR Journal Grade:2
CAS Journal Grade:3
Cited Count:
WoS CC Cited Count: 1
SCOPUS Cited Count: 1
ESI Highly Cited Papers on the List: 0