
Author:

Liu, Jiabin | Li, Biao | Lei, Minglong | Shi, Yong

Indexed by:

EI; Scopus; SCIE

Abstract:

In this paper, we tackle a new learning paradigm called learning from complementary labels, where the training data specifies classes that instances do not belong to, instead of the correct labels. In general, complementary labels are more efficient to collect than supervised ones, since there is no need to select the correct class from a number of candidates. While current state-of-the-art methods design various loss functions to train competitive models with the limited supervised information, they overlook learning from the data and the model themselves, which often contain rich information that can improve the performance of complementary label learning. In this paper, we propose a novel learning framework that seamlessly integrates self-supervision and self-distillation into complementary label learning. Based on the general complementary learning framework, we employ an entropy regularization term to encourage sharper network outputs. Then, to extract more information from the data, we leverage self-supervised learning based on rotation and transformation operations as a plug-in auxiliary task that learns more transferable representations. Finally, knowledge distillation is introduced to further extract the 'dark knowledge' from a trained network to guide the training of a student network. In extensive experiments, our method demonstrates compelling accuracy compared with several state-of-the-art approaches. © 2022 Elsevier Ltd
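
The training objective outlined in the abstract combines a complementary-label loss with an entropy regularizer, a rotation-based self-supervised auxiliary task, and a knowledge-distillation term. The following is a minimal PyTorch-style sketch of how such a combined objective could be assembled; the concrete loss forms, the weights lam_ent, lam_ssl, lam_kd, and the temperature T are illustrative assumptions rather than the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def complementary_loss(logits, comp_labels):
    # Penalize probability mass on the class each instance does NOT belong to.
    probs = F.softmax(logits, dim=1)
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    return -torch.log(1.0 - p_comp + 1e-8).mean()

def entropy_regularizer(logits):
    # Adding the output entropy to the loss pushes the network toward
    # sharper (lower-entropy) predictions.
    probs = F.softmax(logits, dim=1)
    return -(probs * torch.log(probs + 1e-8)).sum(dim=1).mean()

def rotation_ssl_loss(rot_logits, rot_labels):
    # Auxiliary self-supervised task: predict which rotation
    # (e.g. 0/90/180/270 degrees) was applied to the input.
    return F.cross_entropy(rot_logits, rot_labels)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # Match the teacher's softened outputs ("dark knowledge") via KL divergence.
    s = F.log_softmax(student_logits / T, dim=1)
    t = F.softmax(teacher_logits / T, dim=1)
    return F.kl_div(s, t, reduction="batchmean") * (T * T)

def total_loss(logits, comp_labels, rot_logits, rot_labels, teacher_logits,
               lam_ent=0.1, lam_ssl=1.0, lam_kd=1.0):
    # Hypothetical combination of the four terms described in the abstract.
    return (complementary_loss(logits, comp_labels)
            + lam_ent * entropy_regularizer(logits)
            + lam_ssl * rotation_ssl_loss(rot_logits, rot_labels)
            + lam_kd * distillation_loss(logits, teacher_logits))
```

In practice, the rotation logits would come from an additional classification head on shared features and the teacher would be a previously trained network, but those architectural details are not specified in the record above.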

Keyword:

Distillation; Metadata; Learning systems

Author Community:

  • [ 1 ] [Liu, Jiabin] School of Information and Electronics, Beijing Institute of Technology, Beijing 100081, China
  • [ 2 ] [Li, Biao] School of Business Administration, Faculty of Business Administration, Southwestern University of Finance and Economics, Chengdu 611130, China
  • [ 3 ] [Lei, Minglong] Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 4 ] [Shi, Yong] School of Economics and Management, University of Chinese Academy of Sciences, Beijing 100190, China

Source:

Neural Networks

ISSN: 0893-6080

Year: 2022

Volume: 155

Page: 318-327

Impact Factor: 7.800 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 46

JCR Journal Grade: 1

CAS Journal Grade: 2

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 10

ESI Highly Cited Papers on the List: 0

