
Author:

Liu, Shunqiang

Indexed by:

EI Scopus

Abstract:

Model search based on a distillation framework aims to train candidate models adequately and to guide a correct evaluation of each architecture. This NAS approach can readily obtain intermediate-level supervision indicators, which significantly improves search performance. However, distillation-based model search also has shortcomings. First, supervision indicators differ greatly across teacher-student pairs, so determining a highly adaptable supervision indicator is an important issue. Second, different teacher models introduce different biases. To address these problems, this paper proposes the following measures. First, it adopts a more adaptable supervision index, which effectively resolves the large differences among teacher-student pairs. Second, to reduce the bias introduced by the teacher model, it uses the largest teacher model as the guidance model during network training. Finally, it applies a reinforcement learning algorithm to guide the search within the internal network and introduces additional supervision signals, making the supervision effect between layers more pronounced. The results show that these methods effectively improve model performance and consistency. © 2021 SPIE.
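The abstract's two key choices — guiding training with the largest available teacher and adding intermediate-layer supervision — can be illustrated with a minimal sketch. This is a hypothetical illustration, not the paper's actual implementation: the function names, the dict-based model representation, and the `alpha` weighting are assumptions introduced here for clarity.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two probability vectors."""
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def distillation_score(teachers, student, alpha=0.5):
    """Score a candidate (student) network against teacher supervision.

    `teachers` is a list of dicts with 'size', 'logits', 'features'
    (all hypothetical fields). Following the abstract, the largest
    teacher is chosen as the guidance model, and the score combines
    output-level KL divergence with an intermediate-layer
    feature-matching loss. Lower is better.
    """
    teacher = max(teachers, key=lambda t: t["size"])  # largest teacher guides
    output_loss = kl_divergence(softmax(teacher["logits"]),
                                softmax(student["logits"]))
    # Intermediate-level supervision: match hidden-layer features.
    feat_loss = float(np.mean((teacher["features"] - student["features"]) ** 2))
    return alpha * output_loss + (1 - alpha) * feat_loss
```

A candidate that matches the largest teacher's outputs and intermediate features scores near zero, while mismatched candidates are penalized on both terms; the `alpha` parameter trades off the two supervision signals.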

Keyword:

Distillation; Reinforcement learning; Long short-term memory; Learning algorithms; Personnel training

Author Community:

  • [ 1 ] [Liu, Shunqiang] Division of Life and Environment, Beijing University of Technology (BJUT), Beijing 100000, China

Reprint Author's Address:

Email:


Related Keywords:

Related Article:

Source:

ISSN: 0277-786X

Year: 2021

Volume: 11911

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 1

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:
