Abstract:
With the arrival of the 5G era, a new service paradigm known as mobile-edge computing (MEC) has been introduced to provide high-quality mobile services by offloading delay-sensitive and computation-intensive tasks from mobile devices to nearby MEC servers. In this paper, we investigate the problem of delay-sensitive task scheduling and resource (e.g., CPU, memory) management on the server side in a multi-user MEC scenario, and propose a new online algorithm based on deep reinforcement learning (DRL) to reduce the average slowdown and average timeout of tasks in the queue. We also design a new reward function that guides the algorithm to learn, directly from experience, how to schedule tasks and manage resources. Simulation results show that our algorithm outperforms several traditional algorithms and exhibits a clear advantage in adapting intelligently to the workload and environment. © 2019 Association for Computing Machinery.
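The abstract does not state the reward function explicitly; the following is a minimal sketch of one plausible per-step reward for a slowdown- and timeout-oriented DRL scheduler, assuming a DeepRM-style formulation. The function name, arguments, and the timeout weighting are illustrative assumptions, not the authors' definition.

```python
# A minimal sketch (assumption, not the paper's code) of a per-step reward
# for a DRL task scheduler that targets average slowdown and timeouts:
# at each decision step the agent is penalized by the reciprocal service
# time of every job still in the system (summing these over time tracks
# total slowdown), plus an extra penalty for jobs past their deadline.

def step_reward(jobs_in_system, timeout_penalty=1.0):
    """jobs_in_system: iterable of (service_time, deadline_exceeded) tuples."""
    reward = 0.0
    for service_time, deadline_exceeded in jobs_in_system:
        reward -= 1.0 / service_time          # slowdown-related term
        if deadline_exceeded:
            reward -= timeout_penalty         # assumed extra term for the timeout objective
    return reward
```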
Year: 2019
Page: 66-70
Language: English
WoS CC Cited Count: 0
SCOPUS Cited Count: 11
ESI Highly Cited Papers on the List: 0