Abstract:
As a widely used model for time series prediction, the Long Short-Term Memory (LSTM) neural network suffers from high computational cost and large memory requirements due to its complex structure. To address these problems, a PLS-based pruning algorithm is proposed for a simplified LSTM (PSLSTM). First, a hybrid strategy is designed to simplify the internal structure of the LSTM, combining structure simplification with parameter reduction for the gates. Second, partial least squares (PLS) regression coefficients are used as the metric for evaluating the importance of memory blocks, and the redundant hidden layer is pruned by merging unimportant blocks into their most correlated counterparts. Backpropagation Through Time (BPTT) is used as the learning algorithm to update the network parameters. Finally, several benchmark and real-world time series prediction datasets are used to evaluate the performance of the proposed PSLSTM. The experimental results demonstrate that the PLS-based pruning algorithm achieves a trade-off between good generalization ability and a compact network structure: the simple internal structure and compact hidden layer reduce computational complexity without sacrificing prediction accuracy. © 2022 Elsevier B.V.
Source:
Knowledge-Based Systems
ISSN: 0950-7051
Year: 2022
Volume: 254
Impact Factor: 8.800 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE;
ESI HC Threshold: 46
JCR Journal Grade: 1
CAS Journal Grade: 2
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 22
ESI Highly Cited Papers on the List: 0