Abstract:
Deep neural networks have brought significant improvements in accuracy for object detection. A major challenge for deep learning is deployment on low-latency inference systems. Knowledge distillation is widely used to reduce computational complexity and to ensure compatibility with embedded hardware devices. In this paper, we propose two adaptive balance distillation methods for object detection: Positive-Feature Balance Distillation and Hard-Feature Balance Distillation. Positive-Feature Balance Distillation helps to alleviate the imbalance between positive and negative features, while Hard-Feature Balance Distillation forces the student to focus on the small number of hard features in the object detection task. These methods improve the accuracy of student networks by transferring knowledge from a complex teacher network to a simplified student network. We conduct a comprehensive empirical evaluation with different knowledge distillation configurations on KITTI, one of the most important autonomous driving datasets. The proposed methods deliver consistent improvements in the accuracy-speed trade-off for modern object detection methods. © 2022 IEEE.
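The abstract does not give the loss formulation, but balance-weighted feature distillation of this kind is commonly expressed as a re-weighted feature-imitation loss. Below is a minimal PyTorch sketch under that assumption; the function name, the foreground mask `fg_mask`, and the weights `alpha`, `beta`, `gamma` are illustrative, not the paper's actual implementation.

```python
import torch

def balanced_feature_distillation(student_feat, teacher_feat, fg_mask,
                                  alpha=1.0, beta=0.5, gamma=1.0):
    """Illustrative sketch (not the paper's code): balance-weighted
    feature-imitation loss between teacher and student feature maps.

    student_feat, teacher_feat: (N, C, H, W) feature maps.
    fg_mask: (N, 1, H, W) binary mask of positive (object) regions.
    """
    diff = (student_feat - teacher_feat) ** 2           # per-cell squared error

    # Positive-feature balance: average foreground and background errors
    # separately so sparse object regions are not drowned out by background.
    n_pos = fg_mask.sum().clamp(min=1.0)
    n_neg = (1.0 - fg_mask).sum().clamp(min=1.0)
    pos_loss = (diff * fg_mask).sum() / n_pos
    neg_loss = (diff * (1.0 - fg_mask)).sum() / n_neg

    # Hard-feature balance: up-weight cells where the student deviates most
    # from the teacher, so the few hard features dominate that loss term.
    hardness = diff.mean(dim=1, keepdim=True)           # (N, 1, H, W)
    hard_weight = hardness / hardness.sum().clamp(min=1e-6)
    hard_loss = (hardness * hard_weight).sum()

    return alpha * pos_loss + beta * neg_loss + gamma * hard_loss
```

In practice a loss like this would be added to the detector's standard training loss, with `alpha`, `beta`, and `gamma` tuned on a validation split.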
Year: 2022
Volume: 2022-October
Page: 272-275
Language: English
SCOPUS Cited Count: 1
ESI Highly Cited Papers on the List: 0