Abstract:
Gradient-based algorithms are commonly used for training the radial basis function neural network (RBFNN). However, it remains difficult to avoid the vanishing gradient and thereby improve learning performance during training. For this reason, in this paper, an accelerated second-order learning (ASOL) algorithm is developed to train the RBFNN. First, an adaptive expansion and pruning mechanism (AEPM) for the gradient space, based on the integrity and orthogonality of the hidden neurons, is designed, so that effective gradient information is continually added to the gradient space while redundant gradient information is eliminated from it. Second, hidden neurons are generated or pruned according to AEPM, yielding a self-organizing RBFNN (SORBFNN) that reduces structural complexity and improves generalization ability. The structure and parameters are then optimized during learning by the proposed ASOL-based SORBFNN (ASOL-SORBFNN). Third, theoretical analyses are given of the effectiveness of the proposed AEPM in avoiding the vanishing gradient and of the stability of SORBFNN during structural adjustment, guaranteeing the successful application of the proposed ASOL-SORBFNN. Finally, several experimental studies are conducted to illustrate the advantages of the proposed ASOL-SORBFNN. Compared with other existing approaches, the results show that ASOL-SORBFNN performs well in terms of both learning speed and prediction accuracy. © 2021 Elsevier B.V.
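The abstract describes the method only at a high level. As a rough illustration, the following minimal sketch (not the authors' implementation; the class, function names, Gaussian widths, and the 0.98 similarity threshold are all assumptions) shows a plain RBFNN forward pass and a hypothetical correlation-based redundancy check on hidden-neuron outputs, loosely in the spirit of the orthogonality-based pruning that AEPM is described as performing.

```python
import numpy as np

class RBFNN:
    """Minimal Gaussian RBF network: hidden RBF layer + linear output layer."""

    def __init__(self, centers, widths, weights):
        self.centers = np.asarray(centers)   # (H, d) hidden-neuron centers
        self.widths = np.asarray(widths)     # (H,)  Gaussian widths
        self.weights = np.asarray(weights)   # (H,)  linear output weights

    def hidden(self, X):
        # Gaussian activations: phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.widths ** 2))

    def predict(self, X):
        return self.hidden(X) @ self.weights

def redundant_neurons(Phi, threshold=0.98):
    """Return indices of hidden neurons whose output vectors are nearly
    collinear with an earlier neuron's output (cosine similarity above
    `threshold`). Hypothetical stand-in for an orthogonality-based
    pruning criterion; the real AEPM rule is defined in the paper."""
    Phi = Phi / (np.linalg.norm(Phi, axis=0, keepdims=True) + 1e-12)
    sim = Phi.T @ Phi
    H = sim.shape[0]
    return [j for j in range(H) if any(sim[i, j] > threshold for i in range(j))]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    net = RBFNN(centers=rng.normal(size=(6, 2)),
                widths=np.full(6, 1.0),
                weights=rng.normal(size=6))
    Phi = net.hidden(X)
    print("prediction shape:", net.predict(X).shape)
    print("redundant hidden neurons:", redundant_neurons(Phi))
```

In this sketch, neurons flagged as redundant would be candidates for pruning, while poorly covered inputs would motivate adding a neuron; the paper's AEPM makes both decisions from the gradient space rather than from raw activations.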
Source:
Neurocomputing
ISSN: 0925-2312
Year: 2022
Volume: 469
Page: 1-12
Impact Factor: 6.000 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold: 46
JCR Journal Grade:2
CAS Journal Grade:2
Cited Count:
SCOPUS Cited Count: 30
ESI Highly Cited Papers on the List: 0
30 Days PV: 8