Abstract:
When a sigmoidal feedforward neural network (SFNN) is trained with gradient-based algorithms, the quality of the overall learning process depends strongly on the initial weights. To improve training stability and avoid local minima, a mutual information based weight initialization (MIWI) method is proposed for SFNNs. The useful information contained in each input variable is measured by the mutual information (MI) between the input variables and the output variables, and the initial distribution of the weights is made consistent with this information distribution over the inputs. The lower and upper bounds of the weight range are calculated so that the neurons' net inputs lie within the active region of the sigmoid function. The MIWI method therefore places the initial weights close to the global optimum with higher probability and avoids premature saturation. The efficiency of the MIWI method is evaluated on several benchmark problems; the experimental results show that its stability and accuracy are better than those of several other weight initialization methods. (C) 2016 Elsevier B.V. All rights reserved.
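The abstract only sketches the initialization rule at a high level. For orientation, a minimal Python sketch of an MI-guided initialization in that spirit is given below; the mutual_info_regression estimator, the active-region bound of 4.0, the scaling rule, and the function name miwi_init are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: MI-guided weight initialization in the spirit of
# the MIWI idea described in the abstract. The scaling rule, the active-region
# bound, and all names here are assumptions, not the paper's exact method.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def miwi_init(X, y, n_hidden, active_bound=4.0, rng=None):
    """Return an (n_inputs x n_hidden) initial input-to-hidden weight matrix.

    Weights attached to each input are scaled by that input's share of the
    mutual information with the output, then rescaled so the worst-case net
    input of a hidden neuron stays within [-active_bound, active_bound],
    i.e. in the (assumed) active region of the sigmoid.
    """
    rng = np.random.default_rng(rng)
    n_inputs = X.shape[1]

    # MI between each input variable and the output (nonparametric estimate).
    mi = mutual_info_regression(X, y)
    mi = mi / (mi.sum() + 1e-12)          # normalize to an information share

    # Random weights, scaled by each input's MI share.
    W = rng.uniform(-1.0, 1.0, size=(n_inputs, n_hidden)) * mi[:, None]

    # Rescale so |x . w| <= active_bound for inputs assumed scaled to [-1, 1].
    worst_case = np.abs(W).sum(axis=0).max()
    W *= active_bound / (worst_case + 1e-12)
    return W
```

With inputs scaled to [-1, 1], each hidden neuron's net input then stays within [-4, 4], roughly the region where the sigmoid's derivative remains non-negligible, which is the saturation-avoidance property the abstract refers to.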
Source:
NEUROCOMPUTING
ISSN: 0925-2312
Year: 2016
Volume: 207
Page: 676-683
Impact Factor: 6.000 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE
ESI HC Threshold: 167
CAS Journal Grade: 3
Cited Count:
WoS CC Cited Count: 30
SCOPUS Cited Count: 40
ESI Highly Cited Papers on the List: 0