
Deep reinforcement learning-based joint task offloading and bandwidth allocation for multi-user mobile edge computing

Authors :
Xu Feng
Liang Huang
Yuan Wu
Li Ping Qian
Cheng Zhang
Source :
Digital Communications and Networks, Vol 5, Iss 1, Pp 10-17 (2019)
Publication Year :
2019
Publisher :
Elsevier BV, 2019.

Abstract

The rapid growth of mobile internet services has yielded a variety of computation-intensive applications such as virtual/augmented reality. Mobile Edge Computing (MEC), which enables mobile terminals to offload computation tasks to servers located at the edge of cellular networks, has been considered an efficient approach to relieve heavy computational burdens and realize efficient computation offloading. Driven by the consequent requirement for proper resource allocation for computation offloading via MEC, in this paper we propose a Deep-Q Network (DQN)-based task offloading and resource allocation algorithm for MEC. Specifically, we consider an MEC system in which each mobile terminal offloads multiple tasks to the edge server, and we formulate a joint task-offloading and bandwidth-allocation optimization problem to minimize the overall offloading cost in terms of energy cost, computation cost, and delay cost. Although the proposed optimization problem is a mixed-integer nonlinear program, we exploit the emerging DQN technique to solve it. Extensive numerical results show that our proposed DQN-based approach achieves near-optimal performance.

Keywords: Mobile edge computing, Joint computation offloading and resource allocation, Deep-Q network
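To illustrate the kind of DQN-based decision-making the abstract describes, the following is a minimal sketch (not the authors' implementation) of an agent that chooses, per task, whether to execute locally or offload to the edge server, and learns from a reward defined as the negative offloading cost. The environment interface, cost terms, network sizes, and hyper-parameters are illustrative assumptions; the paper's exact system model and bandwidth-allocation step are not reproduced here.

```python
# Sketch of a DQN offloading agent (assumed interfaces; PyTorch for the Q-network).
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim


class QNetwork(nn.Module):
    """Maps a system state (e.g., task sizes, channel gains, server load)
    to Q-values over discrete offloading decisions."""

    def __init__(self, state_dim: int, num_actions: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, num_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)


class DQNOffloader:
    def __init__(self, state_dim: int, num_actions: int,
                 gamma: float = 0.99, lr: float = 1e-3,
                 buffer_size: int = 10_000, batch_size: int = 64):
        self.q = QNetwork(state_dim, num_actions)
        self.target_q = QNetwork(state_dim, num_actions)
        self.target_q.load_state_dict(self.q.state_dict())
        self.opt = optim.Adam(self.q.parameters(), lr=lr)
        self.replay = deque(maxlen=buffer_size)
        self.gamma, self.batch_size = gamma, batch_size
        self.num_actions = num_actions

    def act(self, state, epsilon: float = 0.1) -> int:
        """Epsilon-greedy choice among offloading decisions."""
        if random.random() < epsilon:
            return random.randrange(self.num_actions)
        with torch.no_grad():
            q_values = self.q(torch.as_tensor(state, dtype=torch.float32))
        return int(q_values.argmax())

    def remember(self, s, a, cost, s_next, done):
        # Reward is the negative offloading cost (energy + computation + delay),
        # so minimizing cost corresponds to maximizing return.
        self.replay.append((s, a, -cost, s_next, done))

    def train_step(self):
        """One gradient step on a sampled mini-batch of transitions."""
        if len(self.replay) < self.batch_size:
            return
        batch = random.sample(self.replay, self.batch_size)
        s, a, r, s_next, done = zip(*batch)
        s = torch.as_tensor(s, dtype=torch.float32)
        a = torch.as_tensor(a, dtype=torch.int64).unsqueeze(1)
        r = torch.as_tensor(r, dtype=torch.float32)
        s_next = torch.as_tensor(s_next, dtype=torch.float32)
        done = torch.as_tensor(done, dtype=torch.float32)

        q_sa = self.q(s).gather(1, a).squeeze(1)
        with torch.no_grad():
            target = r + self.gamma * (1 - done) * self.target_q(s_next).max(1).values
        loss = nn.functional.mse_loss(q_sa, target)

        self.opt.zero_grad()
        loss.backward()
        self.opt.step()

    def sync_target(self):
        """Periodically copy online weights to the target network."""
        self.target_q.load_state_dict(self.q.state_dict())
```

In this sketch the bandwidth allocation is assumed to be folded into the environment's cost computation once offloading decisions are fixed; the paper's joint formulation may handle it differently.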

Details

ISSN :
2352-8648
Volume :
5
Database :
OpenAIRE
Journal :
Digital Communications and Networks
Accession number :
edsair.doi.dedup.....3aa50efdd35622515a66d102c14746ad