Author:

Yang, Huijing | Fang, Juan

Indexed by:

EI

Abstract:

In modern multi-core computer architectures, composite prefetching techniques show promise for efficiently handling diverse memory access patterns. However, competition among multiple prefetchers for limited shared resources poses challenges to both system performance and fairness. To enhance the efficiency and fairness of multi-core systems, we propose FAPM, a fairness-aware prefetching mechanism based on reinforcement learning (RL). FAPM uses an RL strategy to adjust each prefetcher's activation and prefetch degree at runtime, enabling dynamic management of multiple prefetchers in a multi-core system. In the design of the reward function, we comprehensively consider the accuracy and timeliness of prefetches on individual cores, as well as fairness across the multi-core system. Additionally, we propose a new runtime method to calculate a proxy metric for fair-speedup, used to evaluate system fairness. Through this comprehensive reward mechanism, FAPM optimizes the performance of individual cores while enhancing overall system fairness. Extensive experiments on multi-core systems show that FAPM achieves up to a 17.64% improvement in fair-speedup, indicating a substantial gain in system fairness. This approach provides a practical and efficient solution to the resource-contention challenge and has the potential to improve fairness and performance in large-scale computing systems. © 2023 IEEE.
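
For background, fair-speedup is conventionally defined as the harmonic mean of each core's speedup relative to running alone; this standard definition is general background, not taken from the record above:

\[
\mathrm{FS} = \frac{N}{\sum_{i=1}^{N} \mathrm{IPC}_i^{\mathrm{alone}} / \mathrm{IPC}_i^{\mathrm{shared}}}
\]

Because measuring each core's alone-run IPC is impractical during execution, FAPM computes a runtime proxy for this metric; the exact proxy construction is described in the paper. As a rough illustration of the control loop the abstract describes, the following is a minimal sketch assuming a tabular Q-learning agent that picks a prefetch degree each epoch and is rewarded by a weighted mix of accuracy, timeliness, and the fairness proxy. The action space, state encoding, weights, and all names here are illustrative assumptions, not the authors' implementation.

import random

DEGREES = [0, 1, 2, 4, 8]              # assumed action space; degree 0 disables the prefetcher
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount, exploration rate (illustrative)

q_table = {}                           # (state, degree) -> estimated long-term reward

def reward(accuracy, timeliness, fairness_proxy,
           w_acc=0.4, w_time=0.2, w_fair=0.4):
    # Weighted mix of per-core prefetch accuracy/timeliness and the
    # system-level fairness proxy; inputs assumed normalized to [0, 1].
    return w_acc * accuracy + w_time * timeliness + w_fair * fairness_proxy

def choose_degree(state):
    # Epsilon-greedy selection of the next prefetch degree.
    if random.random() < EPSILON:
        return random.choice(DEGREES)
    return max(DEGREES, key=lambda d: q_table.get((state, d), 0.0))

def update(state, degree, r, next_state):
    # One-step Q-learning update after observing reward r.
    best_next = max(q_table.get((next_state, d), 0.0) for d in DEGREES)
    old = q_table.get((state, degree), 0.0)
    q_table[(state, degree)] = old + ALPHA * (r + GAMMA * best_next - old)

Each control epoch, a per-core agent would observe a compact state (for example, bucketed recent accuracy and timeliness), call choose_degree to set the hardware prefetcher's degree, then call update with the measured reward.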

Keyword:

Memory architecture; Learning systems; Reinforcement learning

Author Community:

  • [ 1 ] [Yang, Huijing]Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [ 2 ] [Fang, Juan]Beijing University of Technology, Faculty of Information Technology, Beijing, China

Year: 2023

Page: 639-646

Language: English

SCOPUS Cited Count: 1

ESI Highly Cited Papers on the List: 0

