Abstract:
The Levenberg-Marquardt (LM) algorithm is commonly used for training radial basis function neural networks (RBFNNs) because of its fast convergence. However, its successful application is limited by the vanishing gradient problem during learning. To address this problem, this paper designs an accelerated LM (ALM) algorithm to improve the learning performance of RBFNNs. First, an error information filtering mechanism (EIFM) is developed to discard training samples that the RBFNN has already learned, so that the same amount of computation yields more error gradient information. Second, a width adjustment mechanism (WAM) is designed to help training samples escape from the saturated region of the activation function and thereby increase the gradient value. Finally, several benchmark and real-world problems are used to demonstrate the effectiveness of the proposed ALM algorithm. The results show that the proposed ALM-RBFNN can effectively mitigate the vanishing gradient problem and achieves good learning performance.
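The abstract describes the two mechanisms only at a high level; the Python sketch below is one possible reading of them, not the authors' implementation. The error threshold eps, the saturation bound phi_min, and the width scaling factor scale are illustrative names and values assumed here, as is the shape of the predict callable.

import numpy as np

def gaussian_rbf(x, centers, widths):
    """Gaussian RBF activations phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))."""
    d2 = np.sum((x[None, :] - centers) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * widths ** 2))

def eifm_filter(X, y, predict, eps=1e-3):
    """EIFM (sketch): drop samples the network has already learned,
    i.e. whose absolute error is below eps, so the remaining Jacobian
    rows carry more error gradient information per unit of computation."""
    errors = np.abs(y - np.array([predict(x) for x in X]))
    keep = errors >= eps
    return X[keep], y[keep]

def wam_adjust(x, centers, widths, phi_min=1e-4, scale=1.1):
    """WAM (sketch): if a sample lands in the saturated tail of a
    Gaussian unit (phi near 0, so its gradient is near 0 as well),
    enlarge that unit's width to pull the sample back into a region
    where the gradient is non-negligible."""
    phi = gaussian_rbf(x, centers, widths)
    widths = widths.copy()
    widths[phi < phi_min] *= scale
    return widths

Under this reading, eifm_filter would be applied once per epoch before assembling the LM Jacobian, and wam_adjust per sample during the forward pass; the paper itself should be consulted for the exact update rules.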
Source:
2020 CHINESE AUTOMATION CONGRESS (CAC 2020)
ISSN: 2688-092X
Year: 2020
Page: 6804-6809
Language: English
Cited Count:
WoS CC Cited Count: 7
SCOPUS Cited Count: 10
ESI Highly Cited Papers on the List: 0