
Author:

Lu, H. | Liu, Z. | Zhang, M.

Indexed by:

Scopus

Abstract:

Deep neural networks have brought significant improvements in accuracy for object detection. A major challenge for deep learning is deployment on low-latency inference systems. Knowledge distillation is widely used to reduce computational complexity and to ensure compatibility with embedded hardware devices. In this paper, we propose two adaptive balance distillation methods for object detection: Positive-Feature Balance Distillation and Hard-Feature Balance Distillation. Positive-Feature Balance Distillation helps alleviate the imbalance between positive and negative features, while Hard-Feature Balance Distillation forces the student to focus on the small number of hard features in the object detection task. These methods improve the accuracy of student networks by transferring knowledge from a complicated teacher network to a simplified student network. We conduct a comprehensive empirical evaluation with different knowledge distillation configurations on KITTI, one of the most important autonomous driving datasets. The proposed methods deliver consistent improvements in accuracy-speed trade-offs for modern object detection methods. © 2022 IEEE.
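
The abstract describes the two balance distillation losses only at a high level. As a rough illustration of the general idea, here is a minimal PyTorch sketch of a feature-distillation loss combining a positive/negative balance term with a hard-feature term. The function name, the fg_mask input, and the pos_weight and hard_top_frac hyperparameters are illustrative assumptions, not the paper's actual formulation.

import torch

def balanced_feature_distillation_loss(student_feat, teacher_feat, fg_mask,
                                        pos_weight=2.0, hard_top_frac=0.1):
    # Illustrative sketch only, not the paper's method.
    # student_feat, teacher_feat: (N, C, H, W) feature maps from the
    #   student and teacher networks.
    # fg_mask: (N, 1, H, W) binary mask, 1 on positive (object) locations.

    # Per-location squared error between student and teacher features.
    err = (student_feat - teacher_feat).pow(2).mean(dim=1, keepdim=True)

    # Positive-feature balance: up-weight the scarce foreground locations
    # so they are not drowned out by the abundant background.
    weights = fg_mask * (pos_weight - 1.0) + 1.0
    balance_loss = (weights * err).sum() / weights.sum()

    # Hard-feature term: average only the top fraction of locations where
    # the student deviates most from the teacher.
    flat = err.flatten(start_dim=1)                 # (N, H*W)
    k = max(1, int(hard_top_frac * flat.shape[1]))
    hard_loss = flat.topk(k, dim=1).values.mean()

    return balance_loss + hard_loss

In a full training pipeline, such a term would typically be added, with a tunable coefficient, to the student's ordinary detection loss.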

Keyword:

object detection; adaptive balance; knowledge distillation

Author Community:

  • [ 1 ] [Lu H.]North China University of Technology, Beijing Polytechnic College, Beijing, China
  • [ 2 ] [Liu Z.]North China University of Technology, Beijing Polytechnic College, Beijing, China
  • [ 3 ] [Zhang M.]North China University of Technology, Beijing Polytechnic College, Beijing, China

Year: 2022

Volume: 2022-October

Page: 272-275

Language: English

SCOPUS Cited Count: 1

ESI Highly Cited Papers on the List: 0
