
Author:

Wang, GongMing | Qiao, JunFei | Bi, Jing | Li, WenJing | Zhou, MengChu

Indexed by:

EI; Scopus; SCIE

Abstract:

A deep belief network (DBN) is effective for creating a powerful generative model from training data. However, it is difficult to quickly determine its optimal structure for a given application. In this paper, a growing DBN with transfer learning (TL-GDBN) is proposed to automatically decide its structure size, which can accelerate its learning process and improve model accuracy. First, a basic DBN structure with a single hidden layer is initialized and then pretrained, and the learned weight parameters are frozen. Second, TL-GDBN uses TL to transfer the knowledge from the learned weight parameters to newly added neurons and hidden layers, thereby growing the structure until the stopping criterion for pretraining is satisfied. Third, the weight parameters derived from pretraining of TL-GDBN are further fine-tuned by using layer-by-layer partial least square regression from top to bottom, which avoids many problems of traditional backpropagation-based fine-tuning. Moreover, the convergence analysis of TL-GDBN is presented. Finally, TL-GDBN is tested on two benchmark data sets and a practical wastewater treatment system. The simulation results show that it has better modeling performance, faster learning speed, and a more robust structure than existing models.
Note to Practitioners: Transfer learning (TL) aims to improve training effectiveness by transferring knowledge from a source domain to a target domain. This paper presents a growing deep belief network (DBN) with TL to improve training effectiveness and determine the optimal model size. When facing complex processes and real-world workflows, a DBN tends to require a long time to train successfully. The proposed growing DBN with TL (TL-GDBN) accelerates the learning process by instantaneously transferring knowledge from a source domain to each new deeper or wider substructure. The experimental results show that the proposed TL-GDBN model has great potential to deal with complex systems, especially those with high nonlinearity. As a result, it can be readily applied to industrial nonlinear systems.
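The growing-with-transfer step described in the abstract can be sketched as follows. This is a minimal NumPy illustration under assumed layer sizes, not the authors' implementation: it only shows the core idea of seeding a newly added hidden layer from the frozen, already-pretrained top layer instead of re-initializing it at random.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    """Random initialization for a fresh hidden layer (illustrative)."""
    return rng.normal(0.0, 0.1, size=(n_in, n_out))

def grow_with_transfer(weights, n_new):
    """Add a hidden layer whose weights reuse knowledge from the frozen
    top layer: copy the overlapping block, pad any remainder randomly."""
    top = weights[-1]                        # frozen, already pretrained
    n_in = top.shape[1]                      # new layer's input width
    new = rng.normal(0.0, 0.1, size=(n_in, n_new))
    r = min(top.shape[0], n_in)              # rows that can be transferred
    c = min(top.shape[1], n_new)             # columns that can be transferred
    new[:r, :c] = top[:r, :c]                # transferred knowledge block
    return weights + [new]

# Basic DBN with a single hidden layer, then one growing step.
weights = [init_layer(8, 6)]
weights = grow_with_transfer(weights, 6)
print([w.shape for w in weights])            # [(8, 6), (6, 6)]
```

In the actual TL-GDBN, growing happens during restricted Boltzmann machine pretraining and the stopping criterion decides when to stop adding neurons or layers; the sketch above only captures the weight-transfer seeding.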

Keyword:

Convergence analysis; deep belief network (DBN); TL; partial least square regression (PLSR)-based fine-tuning; growing DBN with transfer learning (TL-GDBN)

Author Community:

  • [ 1 ] [Wang, GongMing]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Qiao, JunFei]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Li, WenJing]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 4 ] [Wang, GongMing]Beijing Key Lab Computat Intelligence & Intellige, Beijing 100124, Peoples R China
  • [ 5 ] [Qiao, JunFei]Beijing Key Lab Computat Intelligence & Intellige, Beijing 100124, Peoples R China
  • [ 6 ] [Li, WenJing]Beijing Key Lab Computat Intelligence & Intellige, Beijing 100124, Peoples R China
  • [ 7 ] [Bi, Jing]Beijing Univ Technol, Sch Software Engn, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 8 ] [Zhou, MengChu]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA

Reprint Author's Address:

  • [Bi, Jing]Beijing Univ Technol, Sch Software Engn, Fac Informat Technol, Beijing 100124, Peoples R China


Source :

IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING

ISSN: 1545-5955

Year: 2019

Issue: 2

Volume: 16

Page: 874-885

Impact Factor: 5.600 (JCR@2022)

ESI Discipline: ENGINEERING;

ESI HC Threshold:136

Cited Count:

WoS CC Cited Count: 104

SCOPUS Cited Count: 121

ESI Highly Cited Papers on the List: 0

