
Author:

Chen, Zhonglin | Yang, Cuili | Qiao, Junfei

Indexed by:

EI; Scopus

Abstract:

As an improved recurrent neural network, the Long Short-Term Memory (LSTM) network has been widely applied in many areas. However, a dense LSTM network often requires redundant computation and memory resources and can even suffer from over-fitting, which hinders its practical application. To enhance sparsity and generalization ability, a sparse LSTM neural network trained with a hybrid particle swarm optimization algorithm (SLSTM-HPSO) is proposed. Firstly, based on the LSTM, a hybrid coding method is established that directly encodes both the state and the value of the network weights. Secondly, a fitness function composed of the network training error and an L1-norm term is introduced to control the accuracy and the sparsity of the network simultaneously. Thirdly, an update strategy is designed according to the coding method and the fitness function to search for the optimal solution. Finally, the proposed method outperforms the traditional LSTM neural network by 47.72% in prediction accuracy and 47.02% in sparsity. © 2021 IEEE
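Note: the record reproduces only the abstract, not the paper's equations or code. The sketch below is an illustrative reading of the abstract, not the authors' implementation: each particle carries a hybrid code (a binary state s marking a weight as kept or pruned, plus a continuous value v), and the fitness has the form F = training error + lambda * ||s * v||_1. A toy linear model stands in for the actual LSTM, the binary part uses a standard sigmoid-transfer binary-PSO rule, and all names and hyperparameters (LAM, swarm size, inertia, etc.) are assumptions; the paper's exact update strategy may differ.

import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the LSTM: a linear model y = X @ w with a sparse
# ground truth (assumption; the paper optimizes real LSTM weights).
d, n = 20, 200
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:5] = rng.normal(size=5)
y = X @ w_true + 0.01 * rng.normal(size=n)

LAM = 0.05  # weight of the L1 term (assumed hyperparameter)

def fitness(s, v):
    """Training error plus L1-norm penalty, as described in the abstract."""
    w = s * v                          # effective (sparse) weights
    return np.mean((X @ w - y) ** 2) + LAM * np.abs(w).sum()

# Hybrid swarm: continuous positions/velocities for the values,
# binary positions with real-valued velocities for the states.
P = 30
v_pos = rng.normal(size=(P, d)); v_vel = np.zeros((P, d))
s_pos = (rng.random((P, d)) < 0.5).astype(float); s_vel = np.zeros((P, d))
pbest_s, pbest_v = s_pos.copy(), v_pos.copy()
pbest_f = np.array([fitness(pbest_s[i], pbest_v[i]) for i in range(P)])
g = int(np.argmin(pbest_f))
gbest_s, gbest_v = pbest_s[g].copy(), pbest_v[g].copy()

w_in, c1, c2 = 0.7, 1.5, 1.5       # assumed PSO coefficients
for _ in range(200):
    r1, r2 = rng.random((P, d)), rng.random((P, d))
    # continuous-value update (classic PSO)
    v_vel = w_in * v_vel + c1 * r1 * (pbest_v - v_pos) + c2 * r2 * (gbest_v - v_pos)
    v_pos = v_pos + v_vel
    # binary-state update (binary PSO with sigmoid transfer)
    s_vel = np.clip(w_in * s_vel + c1 * r1 * (pbest_s - s_pos)
                    + c2 * r2 * (gbest_s - s_pos), -6.0, 6.0)
    s_pos = (rng.random((P, d)) < 1.0 / (1.0 + np.exp(-s_vel))).astype(float)
    # personal/global best bookkeeping
    f = np.array([fitness(s_pos[i], v_pos[i]) for i in range(P)])
    better = f < pbest_f
    pbest_f[better] = f[better]
    pbest_s[better] = s_pos[better]; pbest_v[better] = v_pos[better]
    g = int(np.argmin(pbest_f))
    gbest_s, gbest_v = pbest_s[g].copy(), pbest_v[g].copy()

print("kept weights:", int(gbest_s.sum()), "of", d)
print("best fitness:", pbest_f[g])

Because the L1 term charges for every surviving weight, the swarm is pushed toward particles whose state bits prune weights that do not reduce the training error, which is the accuracy-versus-sparsity trade-off the fitness function is meant to control.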

Keyword:

Network coding; Particle swarm optimization (PSO); Long short-term memory

Author Community:

  • [1] [Chen, Zhonglin] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing University of Technology, Beijing, China
  • [2] [Yang, Cuili] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing University of Technology, Beijing, China
  • [3] [Qiao, Junfei] Beijing Key Laboratory of Computational Intelligence and Intelligent System, Beijing University of Technology, Beijing, China

Year: 2021

Page: 846-851

Language: English

ESI Highly Cited Papers on the List: 0

