
Author:

Luo, D. | Zheng, K. | Wu, C. | Wang, X. | Wang, J.

Indexed by:

EI Scopus SCIE

Abstract:

Despite their potential, the industrial deployment of large language models (LLMs) is constrained by traditional fine-tuning procedures that are both resource-intensive and time-consuming. Low-Rank Adaptation (LoRA) has emerged as a pioneering methodology for addressing these challenges. By integrating low-rank decomposition matrices into network weights to reduce trainable parameters, LoRA effectively accelerates the adaptation process. While research on LoRA primarily focuses on adjusting the low-rank matrices, DyLoRA optimizes the rank-setting mechanism to avoid the extensive effort of training and searching over rank sizes. However, DyLoRA's rank configuration mechanism has its own limitations. First, DyLoRA sets the same rank size for all low-rank adaptation layers at each time step. Given that layers at different depths contain distinct information, they should have varying rank values to accurately capture their unique characteristics. Second, the truncated phase selected for ordered representation based on nested dropout regularization is only half dynamic, continuously dropping tail units, thereby limiting its access to information. In this work, we propose a novel technique, enhanced range adaptation in time- and depth-aware dynamic LoRA (ERAT-DLoRA), to address these problems. ERAT-DLoRA introduces a dynamic range to the truncated phase, making it fully dynamic. Additionally, we design a time- and layer-aware dynamic rank to ensure appropriate adjustments at different time steps and layer levels. We evaluate our solution on natural language understanding and language generation tasks. Extensive evaluation results demonstrate the effectiveness of the proposed method. © 2024
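The abstract's description of LoRA — trainable low-rank decomposition matrices added alongside frozen pretrained weights — can be sketched as follows. This is a minimal illustration of the general LoRA technique, not the paper's ERAT-DLoRA implementation; the class and parameter names are ours:

```python
import numpy as np

class LoRALinear:
    """Sketch of a LoRA-adapted linear layer: the frozen weight W is
    augmented by a trainable low-rank update B @ A, so only
    r * (d_in + d_out) parameters are trained instead of d_in * d_out."""

    def __init__(self, d_in, d_out, r, alpha=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
        self.A = rng.standard_normal((r, d_in)) * 0.01  # trainable down-projection
        self.B = np.zeros((d_out, r))                   # trainable up-projection, zero-init
        self.scale = alpha / r                          # scaling factor on the update

    def __call__(self, x):
        # Forward pass: frozen path plus scaled low-rank correction.
        return self.W @ x + self.scale * (self.B @ (self.A @ x))

    def trainable_params(self):
        return self.A.size + self.B.size

layer = LoRALinear(d_in=64, d_out=64, r=4)
y = layer(np.ones(64))
print(y.shape)                   # (64,) — same output shape as the base layer
print(layer.trainable_params())  # 512 trainable vs 4096 frozen parameters
```

Because B is zero-initialized, the adapted layer initially reproduces the frozen model exactly; training then moves only A and B. DyLoRA's contribution, as the abstract notes, is to make the rank r dynamic rather than fixed, and ERAT-DLoRA further varies it across time steps and layer depths.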

Keyword:

LoRA; Parameter-efficient fine-tuning

Author Community:

  • [ 1 ] [Luo D.]Beijing University of Posts and Telecommunications, No. 10 Xitu Cheng Road, Beijing, Beijing, 100876, China
  • [ 2 ] [Zheng K.]Beijing University of Posts and Telecommunications, No. 10 Xitu Cheng Road, Beijing, Beijing, 100876, China
  • [ 3 ] [Wu C.]Beijing University of Posts and Telecommunications, No. 10 Xitu Cheng Road, Beijing, Beijing, 100876, China
  • [ 4 ] [Wang X.]Beijing University of Technology, No. 100 Pingleyuan, Chaoyang District, Beijing, 100124, China
  • [ 5 ] [Wang J.]Beijing University of Posts and Telecommunications, No. 10 Xitu Cheng Road, Beijing, Beijing, 100876, China

Source:

Neurocomputing

ISSN: 0925-2312

Year: 2025

Volume: 614

Impact Factor: 6.000 (JCR@2022)

ESI Highly Cited Papers on the List: 0