Abstract:
The management of computing resources through the computing power network (CPN) has gradually become a focal point of research. With the development of 6th-generation (6G) mobile networks, promising technologies such as the satellite-terrestrial integrated network (STIN) and the smart endogenous network driven by artificial intelligence (AI) are increasingly being applied in the Industrial Internet of Things (IIoT). However, several issues in current studies deserve attention: 1) the large number of battery-powered devices in IIoT, 2) the complex communication environments, and 3) the finite computing resources available for task data processing. To cope with these challenges, a satellite-terrestrial integrated computing power network (STICPN) framework is introduced in this article. Within this framework, a task offloading link selection scheme is proposed that minimizes delay and energy consumption. The task offloading optimization problem is modeled as a Markov decision process (MDP), and a deep reinforcement learning (DRL) algorithm is employed to adapt to the dynamic state of the environment. Specifically, a Dueling Double Deep Q-Network (D3QN) is used to make offloading decisions, significantly reducing both delay and energy consumption. Moreover, the D3QN-based scheme extends the operating time of IIoT devices. Simulation results indicate that the proposed scheme significantly outperforms the comparison schemes. © 2025 IEEE.
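The abstract describes offloading decisions made by a D3QN agent over an MDP. The sketch below illustrates the general D3QN technique (dueling value/advantage heads plus double-DQN target computation) for a discrete set of offloading links; the state contents, network sizes, replay buffer, and hyperparameters are illustrative assumptions and do not reproduce the paper's implementation.

```python
# Minimal D3QN sketch for discrete offloading-link selection (illustrative only).
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim


class DuelingQNet(nn.Module):
    """Dueling architecture: shared trunk, separate value and advantage streams."""

    def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)              # V(s)
        self.advantage = nn.Linear(hidden, n_actions)  # A(s, a)

    def forward(self, s: torch.Tensor) -> torch.Tensor:
        h = self.trunk(s)
        v, a = self.value(h), self.advantage(h)
        # Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)
        return v + a - a.mean(dim=1, keepdim=True)


class D3QNAgent:
    """Double DQN update on top of the dueling network (hypothetical setup)."""

    def __init__(self, state_dim: int, n_actions: int, gamma: float = 0.99):
        self.online = DuelingQNet(state_dim, n_actions)
        self.target = DuelingQNet(state_dim, n_actions)
        self.target.load_state_dict(self.online.state_dict())
        self.opt = optim.Adam(self.online.parameters(), lr=1e-3)
        self.buffer = deque(maxlen=10_000)  # replay buffer of (s, a, r, s', done)
        self.gamma, self.n_actions = gamma, n_actions

    def act(self, state, epsilon: float = 0.1) -> int:
        # Epsilon-greedy choice of offloading link index.
        if random.random() < epsilon:
            return random.randrange(self.n_actions)
        with torch.no_grad():
            q = self.online(torch.as_tensor(state, dtype=torch.float32).unsqueeze(0))
        return int(q.argmax(dim=1).item())

    def update(self, batch_size: int = 64) -> None:
        if len(self.buffer) < batch_size:
            return
        batch = random.sample(self.buffer, batch_size)
        s, a, r, s2, done = (torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch))
        q = self.online(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            # Double DQN: the online net selects the next action, the target net evaluates it.
            next_a = self.online(s2).argmax(dim=1, keepdim=True)
            next_q = self.target(s2).gather(1, next_a).squeeze(1)
            target = r + self.gamma * (1.0 - done) * next_q
        loss = nn.functional.mse_loss(q, target)
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
```

In a typical training loop, the agent would periodically copy the online weights into the target network and anneal epsilon; a reward combining negative delay and negative energy consumption would match the objective stated in the abstract.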
Source:
IEEE Internet of Things Journal
ISSN: 2327-4662
Year: 2025
Impact Factor: 10.600 (JCR@2022)