
Author:

Zhou, Ruizhi | Niu, Lingfeng | Xu, Dachuan

Indexed by:

EI; Scopus; SCIE

Abstract:

Deep neural networks (DNNs) have shown great success in machine learning tasks and are widely used in many fields. However, the substantial computational and storage requirements inherent to DNNs pose challenges for deploying deep learning models on resource-limited devices and hinder further applications. To address this issue, lightweight neural networks have garnered significant attention, and quantization has become one of the most popular approaches to compressing DNNs. In this paper, we introduce a sparse loss-aware ternarization (SLT) model for training ternary neural networks, which encodes the floating-point parameters into {−1, 0, 1}. Specifically, we abstract the ternarization process as an optimization problem with discrete constraints, and then modify it by applying sparse regularization to identify insignificant weights. To deal with the challenges brought by the discreteness of the model, we decouple the discrete constraints from the objective function and design a new algorithm based on the Alternating Direction Method of Multipliers (ADMM). Extensive experiments are conducted on public datasets with popular network architectures. Comparisons with several state-of-the-art baselines demonstrate that SLT attains comparable accuracy while achieving better compression performance. © 2024 Elsevier Inc.
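The abstract's core operation (encoding floating-point weights into {−1, 0, 1} while zeroing insignificant ones) can be illustrated with a generic threshold-based ternarization step. This is a minimal sketch, not the authors' SLT/ADMM algorithm: the threshold rule, the `delta_ratio` parameter, and the scaling factor `alpha` are common choices from the ternary-network literature, assumed here for illustration.

```python
import numpy as np

def ternarize(w, delta_ratio=0.7):
    """Generic threshold-based ternarization (illustrative sketch, not SLT).

    Weights with magnitude below a threshold are zeroed (the "insignificant"
    weights); the rest are mapped to +/-alpha, where alpha is the mean
    magnitude of the surviving weights.
    """
    delta = delta_ratio * np.mean(np.abs(w))   # magnitude threshold
    mask = np.abs(w) > delta                   # weights that survive pruning
    alpha = np.abs(w[mask]).mean() if mask.any() else 0.0
    # Each entry ends up in {-alpha, 0, +alpha}, i.e. a scaled {-1, 0, 1} code.
    return alpha * np.sign(w) * mask

w = np.array([0.9, -0.05, 0.4, -0.8, 0.02])
q = ternarize(w)   # small-magnitude entries become 0, the rest +/-alpha
```

Storing only the sign pattern plus one scalar `alpha` per layer is what yields the compression: each weight needs roughly 2 bits instead of 32.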

Keyword:

Federated learning; Adversarial machine learning; Neural network models; Contrastive Learning

Author Community:

  • [ 1 ] [Zhou, Ruizhi] College of Science, China Agricultural University, Beijing 100083, China
  • [ 2 ] [Niu, Lingfeng] School of Economics and Management, University of Chinese Academy of Sciences, Beijing 100190, China
  • [ 3 ] [Xu, Dachuan] Institute of Operations Research and Information Engineering, Beijing University of Technology, Beijing 100124, China

Source :

Information Sciences

ISSN: 0020-0255

Year: 2025

Volume: 693

Impact Factor: 8.100 (JCR@2022)

ESI Highly Cited Papers on the List: 0
