Abstract:
An echo-state network (ESN) is an effective alternative to gradient methods for training recurrent neural networks. However, it is difficult to determine the structure (mainly the reservoir) of an ESN to match the given application. In this paper, a growing ESN (GESN) is proposed to design the size and topology of the reservoir automatically. First, the GESN uses block matrix theory to add hidden units to the existing reservoir group by group, which leads to a GESN with multiple subreservoirs. Second, every subreservoir weight matrix in the GESN is created with a predefined singular value spectrum, which ensures the echo-state property of the ESN without posterior scaling of the weights. Third, during the growth of the network, the output weights of the GESN are updated incrementally. Moreover, the convergence of the GESN is proved. Finally, the GESN is tested on some artificial and real-world time-series benchmarks. Simulation results show that the proposed GESN has better prediction performance and faster learning speed than some ESNs with fixed sizes and topologies.
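As a rough illustration of the construction described in the abstract, the Python sketch below builds subreservoir weight matrices from random orthogonal factors and a predefined singular value spectrum whose largest value stays below 1, which is a standard sufficient condition for the echo-state property and removes the need for posterior scaling. This is a minimal sketch only: the block-diagonal placement of the subreservoirs, the linearly decaying spectrum, and the function name make_subreservoir are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def make_subreservoir(size, max_singular_value=0.9, rng=None):
    """Build one subreservoir weight matrix with a prescribed singular
    value spectrum.  Keeping every singular value below 1 is a sufficient
    condition for the echo-state property, so no posterior scaling of the
    weights is needed.  The linearly decaying spectrum used here is an
    illustrative assumption, not necessarily the paper's choice."""
    rng = np.random.default_rng(rng)
    # Random orthogonal factors U and V obtained from QR decompositions.
    u, _ = np.linalg.qr(rng.standard_normal((size, size)))
    v, _ = np.linalg.qr(rng.standard_normal((size, size)))
    # Predefined singular values, all strictly below 1.
    s = np.linspace(max_singular_value, max_singular_value / size, size)
    return u @ np.diag(s) @ v.T

# Growing the reservoir group by group: each new subreservoir is appended
# as a diagonal block (a simplifying assumption for this sketch), so the
# singular values of the full matrix are the union of the blocks' spectra
# and the echo-state property is preserved as the network grows.
size, groups = 20, 3
W = np.zeros((size * groups, size * groups))
for k in range(groups):
    W[k * size:(k + 1) * size, k * size:(k + 1) * size] = make_subreservoir(size, rng=k)
print(np.linalg.svd(W, compute_uv=False)[0])  # largest singular value < 1
```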
Source:
IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
ISSN: 2162-237X
Year: 2017
Issue: 2
Volume: 28
Page: 391-404
Impact Factor: 10.400 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE
ESI HC Threshold: 175
CAS Journal Grade:1
Cited Count:
WoS CC Cited Count: 138
SCOPUS Cited Count: 168
ESI Highly Cited Papers on the List: 0