Abstract:
Continual learning, also known as lifelong learning, addresses the fundamental challenge of enabling machine learning models to learn from a continuous stream of data and adapt to changing environments without forgetting previously acquired information. This paradigm emulates the human ability to accumulate knowledge and skills over time, allowing models to build on past experience to improve performance on new tasks. To this end, continual learning algorithms need robust and stable representations that generalize to new data. In this paper, we present an algorithm, based on the information bottleneck and metric learning, that learns more concise and semantic representations beneficial for continual learning. Combined with a coreset, which retains a small subset of data from previous tasks, our algorithm attains high performance on the MNIST benchmark for incremental continual learning tasks. © 2023 IEEE.
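The abstract names two mechanisms: an information-bottleneck objective that compresses representations, and a coreset that replays a small sample of past-task data. The following is a minimal PyTorch sketch of those two pieces, not the authors' implementation; the class and parameter names (VIBClassifier, CoresetBuffer, beta, per_task) are illustrative assumptions, and the paper's metric-learning component is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIBClassifier(nn.Module):
    """Encodes flattened input x into a stochastic code z; a linear head classifies z."""
    def __init__(self, in_dim=784, z_dim=32, n_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, 2 * z_dim))  # outputs mu and logvar
        self.head = nn.Linear(z_dim, n_classes)

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization trick
        return self.head(z), mu, logvar

def vib_loss(logits, y, mu, logvar, beta=1e-3):
    # Cross-entropy plus a KL term that penalizes information carried by z,
    # encouraging the concise representations the information bottleneck targets.
    ce = F.cross_entropy(logits, y)
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
    return ce + beta * kl

class CoresetBuffer:
    """Keeps a small random subset of each past task's data for replay."""
    def __init__(self, per_task=200):
        self.per_task, self.xs, self.ys = per_task, [], []

    def add_task(self, x, y):
        idx = torch.randperm(len(x))[:self.per_task]
        self.xs.append(x[idx]); self.ys.append(y[idx])

    def sample(self, n):
        x, y = torch.cat(self.xs), torch.cat(self.ys)
        idx = torch.randint(len(x), (n,))
        return x[idx], y[idx]
```

In an incremental setting, each minibatch from the current task would be concatenated with a batch from `buffer.sample(...)` before computing `vib_loss`, so the KL term keeps the code compact while replay counters forgetting.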
Year: 2023
Page: 8824-8828
Language: English