Abstract:
Deep neural networks (DNNs) based on incremental learning support efficient garbage classification. However, accurately learning and preserving the information of known classes while the DNN is updated for continuously emerging new tasks remains challenging, and this also degrades the generalization performance of the model. To address these issues, an incremental evolution learning (IEL) method based on prototype enhancement is proposed to accurately preserve known-class data and improve model generalization. First, a prototype enhancement method based on multi-dimensional Gaussian kernel density estimation is designed, which extends the prototype of each sample according to the high-dimensional nonlinear data distribution, so that the enhanced prototypes accurately represent the known-class data. Second, a contrastive feature method is proposed to constrain the consistency of features between tasks, which reduces the deviation between different tasks, balances the extraction preference caused by class-sample imbalance, and improves generalization. Third, the proposed IEL is applied to garbage classification with imbalanced class samples, where it adapts effectively to the differences between known and new classes. Finally, experiments on four standard datasets and one public garbage dataset verify that IEL maintains strong classification ability as the number of classes in the learning tasks increases.
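The abstract only sketches the prototype enhancement step, so the following is a minimal, hypothetical illustration of how a multi-dimensional Gaussian kernel density estimate could be used to enrich a known-class prototype with pseudo-features. The function name enhance_prototypes, the use of scipy.stats.gaussian_kde, and all parameter choices are illustrative assumptions, not the authors' implementation.

    import numpy as np
    from scipy.stats import gaussian_kde

    def enhance_prototypes(features, n_pseudo=50):
        """Hypothetical prototype enhancement for one known class.

        features : ndarray of shape (n_samples, dim), deep features of the class
        n_pseudo : number of pseudo-features to draw from the estimated density
        """
        # Class prototype: the mean feature vector of the known class.
        prototype = features.mean(axis=0)

        # Fit a multivariate Gaussian KDE to the class features.
        # gaussian_kde expects the data as (dim, n_samples).
        kde = gaussian_kde(features.T)

        # Draw pseudo-features from the estimated density; in a later
        # incremental step these could stand in for the raw exemplars
        # of the known class.
        pseudo_features = kde.resample(n_pseudo).T
        return prototype, pseudo_features

Note that gaussian_kde needs more samples than feature dimensions to form a non-singular covariance, so with high-dimensional deep features the paper's actual estimator is presumably defined differently (e.g. on reduced features or per-dimension statistics); the sketch only illustrates the general idea of density-based prototype enhancement.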
Source: IEEE Transactions on Artificial Intelligence
ISSN: 2691-4581
Year: 2023
Issue: 1
Volume: 5
Page: 1-14
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 5
ESI Highly Cited Papers on the List: 0