Abstract:
In the extreme learning machine (ELM), a large number of hidden nodes is required because the hidden layer is generated randomly. To improve network compactness, this paper studies the ELM with a smoothed l0 regularizer (ELM-SL0 for short). Firstly, the l0 regularization penalty term is introduced into the conventional error function, so that unimportant output weights are gradually forced to zero. Secondly, the batch gradient method and the smoothed l0 regularizer are combined for training and pruning the ELM. Furthermore, both the weak convergence and the strong convergence of ELM-SL0 are investigated. Compared with other existing ELMs, the proposed algorithm achieves better performance in terms of estimation accuracy and network sparsity.
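As a rough illustration of the training procedure the abstract describes, the following is a minimal sketch of an ELM whose output weights are learned by batch gradient descent on a squared error plus a smoothed l0 penalty, then pruned. The smoothing kernel, hyper-parameters (sigma, lam, lr, prune_tol), and the helper names are illustrative assumptions, not the paper's reported settings.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_elm_sl0(X, y, n_hidden=50, lam=1e-3, sigma=0.1,
                  lr=1e-2, epochs=500, prune_tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    # Randomly generated (and fixed) hidden-layer parameters, as in standard ELM.
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = sigmoid(X @ W + b)                        # hidden-layer output matrix

    beta = rng.normal(scale=0.01, size=n_hidden)  # output weights to be learned
    n = X.shape[0]
    for _ in range(epochs):
        err = H @ beta - y
        # Gradient of the mean squared error term.
        grad = (H.T @ err) / n
        # Gradient of an assumed smoothed l0 penalty
        # f_sigma(beta) = sum(1 - exp(-beta^2 / (2 * sigma^2))).
        grad += lam * (beta / sigma**2) * np.exp(-beta**2 / (2 * sigma**2))
        beta -= lr * grad                          # batch gradient step

    # Prune output weights that the penalty has driven (near) to zero.
    beta[np.abs(beta) < prune_tol] = 0.0
    return W, b, beta

# Toy usage: fit a noisy 1-D target and report the surviving output weights.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(200, 2))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=200)
    W, b, beta = train_elm_sl0(X, y)
    print("nonzero output weights:", np.count_nonzero(beta))
```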
Source: MOBILE NETWORKS & APPLICATIONS
ISSN: 1383-469X
Year: 2020
Issue: 6
Volume: 25
Page: 2434-2446
Impact Factor: 3.800 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold: 132
Cited Count:
WoS CC Cited Count: 3
SCOPUS Cited Count: 5
ESI Highly Cited Papers on the List: 0