Abstract:
Well-documented evidence shows that integrating the small-world (SW) property into the design of feedforward neural networks improves network performance. To achieve structural self-adaptation of feedforward small-world neural networks (FSWNNs), this paper proposes a self-organizing FSWNN, namely SOFSWNN, based on a hub-based self-organizing algorithm. Firstly, an FSWNN is constructed according to the Watts-Strogatz rule. Drawing on graph theory, the hub centrality is calculated for each hidden neuron and used as a measure of its importance. The self-organizing algorithm splits important neurons and merges unimportant neurons with their most correlated neurons, and the convergence of this algorithm is guaranteed theoretically. Extensive experiments validate the effectiveness and superiority of SOFSWNN on both classification and regression problems. SOFSWNN achieves improved generalization performance owing to the SW property and the self-organizing structure. Moreover, the hub-based self-organizing algorithm adaptively determines a compact and stable network structure even when started from different initial structures. © 2017 IEEE.
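The split/merge mechanism summarized in the abstract can be illustrated with a minimal sketch. The centrality measure, the thresholds, and the split/merge rules below are all simplifying assumptions for illustration; the paper's actual hub centrality is derived from graph theory, and its algorithm comes with a convergence guarantee not reproduced here.

```python
import numpy as np

def hub_centrality(w_in, w_out):
    """Proxy hub centrality of each hidden neuron: total absolute
    connection strength (a stand-in for the paper's graph-theoretic
    definition, which is not reproduced here)."""
    return np.abs(w_in).sum(axis=0) + np.abs(w_out).sum(axis=1)

def self_organize_step(w_in, w_out, split_thr=4.0, merge_thr=1.0):
    """One structural adaptation pass: merge unimportant hidden
    neurons into a correlated peer, then split important ones.
    Thresholds are hypothetical."""
    h = hub_centrality(w_in, w_out)
    # Merge: delete in descending index order so remaining original
    # indices stay valid; fold the deleted neuron's output weights
    # into its most correlated peer (correlation of input weights).
    for j in sorted(np.where(h < merge_thr)[0], reverse=True):
        if w_in.shape[1] <= 2:          # keep a minimal hidden layer
            break
        corr = w_in.T @ w_in[:, j]
        corr[j] = -np.inf               # exclude self-correlation
        k = int(np.argmax(corr))
        w_out[k] += w_out[j]            # preserve merged contribution
        w_in = np.delete(w_in, j, axis=1)
        w_out = np.delete(w_out, j, axis=0)
    # Split: duplicate an important neuron and halve its outgoing
    # weights so the network output is unchanged at the split point.
    h = hub_centrality(w_in, w_out)
    for j in np.where(h > split_thr)[0]:
        w_in = np.hstack([w_in, w_in[:, j:j + 1]])
        w_out[j] /= 2.0
        w_out = np.vstack([w_out, w_out[j:j + 1]])
    return w_in, w_out
```

Under this sketch the hidden-layer width grows or shrinks depending on how centrality is distributed, while input and output dimensions stay fixed.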
Source:
IEEE Transactions on Emerging Topics in Computational Intelligence
ISSN: 2471-285X
Year: 2025
Issue: 1
Volume: 9
Page: 160-175
ESI Highly Cited Papers on the List: 0