Task offloading for vehicular edge computing with edge-cloud cooperation.
- Source :
- World Wide Web. Sep 2022, Vol. 25, Issue 5, p1999-2017. 19p.
- Publication Year :
- 2022
Abstract
- Vehicular edge computing (VEC) is emerging as a novel computing paradigm that meets the low-latency demands of computation-intensive vehicular applications. However, most existing offloading schemes do not take the dynamic edge-cloud computing environment into account, resulting in high processing delay. In this paper, we propose an efficient offloading scheme based on deep reinforcement learning for VEC with edge-cloud cooperation, in which computation-intensive tasks can be executed locally, offloaded to an edge server, or offloaded to a cloud server. By jointly considering i) the dynamic edge-cloud computing environment and ii) the need for fast offloading decisions, we leverage deep reinforcement learning to minimize the average processing delay of tasks by effectively integrating the computation resources of vehicles, edge servers, and the cloud server. Specifically, a deep Q-network (DQN) adaptively learns optimal offloading decisions in the dynamic environment by balancing exploration and exploitation. Furthermore, the offloading scheme can be learned quickly by accelerating the convergence of DQN training, which supports fast offloading decisions. Extensive simulation experiments show that the proposed offloading scheme achieves good delay performance. [ABSTRACT FROM AUTHOR]
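As an illustration of the epsilon-greedy DQN idea summarized in the abstract, the following Python sketch trains a small Q-network to choose among local, edge, and cloud execution so as to minimize a simulated processing delay. The state features, the toy delay model, and all hyperparameters here are hypothetical placeholders for illustration only; they are not the system model or parameters from the paper.

import random
from collections import deque

import torch
import torch.nn as nn

# Hypothetical state: [task size, local CPU load, edge CPU load, uplink rate]
STATE_DIM, N_ACTIONS = 4, 3          # actions: 0 = local, 1 = edge, 2 = cloud

class QNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS))
    def forward(self, x):
        return self.net(x)

def select_action(qnet, state, epsilon):
    # Epsilon-greedy: explore with probability epsilon, otherwise exploit argmax Q
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(qnet(torch.tensor(state)).argmax())

def simulated_delay(state, action):
    # Toy delay model (illustrative only, not the paper's system model)
    size, local_load, edge_load, rate = state
    if action == 0:                        # local execution
        return size * (1.0 + local_load)
    if action == 1:                        # offload to edge server
        return size / rate + size * (0.3 + edge_load)
    return size / rate + 0.5 + size * 0.1  # offload to remote cloud

qnet, target = QNet(), QNet()
target.load_state_dict(qnet.state_dict())
opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)
buffer, gamma, batch = deque(maxlen=10000), 0.9, 32

for step in range(2000):
    epsilon = max(0.05, 1.0 - step / 1000)           # annealed exploration rate
    state = [random.uniform(0.1, 1.0) for _ in range(STATE_DIM)]
    action = select_action(qnet, state, epsilon)
    reward = -simulated_delay(state, action)         # minimizing delay = maximizing reward
    next_state = [random.uniform(0.1, 1.0) for _ in range(STATE_DIM)]
    buffer.append((state, action, reward, next_state))

    if len(buffer) >= batch:
        s, a, r, s2 = zip(*random.sample(buffer, batch))
        s, s2 = torch.tensor(s), torch.tensor(s2)
        a = torch.tensor(a).unsqueeze(1)
        r = torch.tensor(r)
        q = qnet(s).gather(1, a).squeeze(1)           # Q(s, a) for sampled actions
        with torch.no_grad():
            y = r + gamma * target(s2).max(1).values  # bootstrapped target value
        loss = nn.functional.mse_loss(q, y)
        opt.zero_grad(); loss.backward(); opt.step()

    if step % 200 == 0:                               # periodic target-network sync
        target.load_state_dict(qnet.state_dict())

The replay buffer and the periodically synchronized target network are standard DQN ingredients that stabilize training and help it converge faster, which is in the spirit of the fast-convergence point made in the abstract.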
Details
- Language :
- English
- ISSN :
- 1386-145X
- Volume :
- 25
- Issue :
- 5
- Database :
- Academic Search Index
- Journal :
- World Wide Web
- Publication Type :
- Academic Journal
- Accession number :
- 159530865
- Full Text :
- https://doi.org/10.1007/s11280-022-01011-8