Fire Control & Command Control, 2024, Vol. 49, Issue 7: 44-49, 6. DOI: 10.3969/j.issn.1002-0640.2024.07.007
Deep Deterministic Policy Gradient-based Energy Efficiency Optimization Algorithm for MEC Networks
Abstract
Although Mobile Edge Computing (MEC) can bring data-processing services close to users, the computing resources of MEC servers are themselves limited. Migrating user tasks to MEC servers appropriately, and allocating server resources to users according to task requirements, are therefore key to improving user energy efficiency. To this end, a Deep Deterministic Policy Gradient-based Energy Efficiency Optimization (DDPG-EEO) algorithm is proposed. Subject to task delay requirements, an optimization problem is formulated that maximizes energy efficiency over the task offloading rate and the resource allocation strategy. The problem is then cast as a Markov Decision Process (MDP) and solved with the Deep Deterministic Policy Gradient method. Simulation results show that the DDPG-EEO algorithm reduces the energy consumption of user terminals (UTs) and improves the task completion rate.
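The abstract's objective, maximizing energy efficiency over the task offloading rate and resource allocation subject to a delay requirement, can be made concrete with a toy single-user model. The sketch below uses a standard MEC energy/delay model, not the paper's own formulation: all names and parameter values (`KAPPA`, `CYCLES_PER_BIT`, the channel gain, noise power, and so on) are illustrative assumptions.

```python
import math

# Toy parameters -- illustrative assumptions, not values from the paper.
KAPPA = 1e-27          # effective switched capacitance of the UT's CPU
CYCLES_PER_BIT = 1e3   # CPU cycles needed per bit of task data
BANDWIDTH = 1e6        # uplink bandwidth (Hz)

def task_metrics(bits, rho, f_local, f_mec, p_tx, gain, noise):
    """Delay (s) and UT energy (J) when a fraction rho of the task is offloaded.

    Local execution and offloading proceed in parallel, so the task delay is
    the slower of the two branches. Only the user terminal's energy is counted.
    """
    local_bits = (1.0 - rho) * bits
    offload_bits = rho * bits
    t_local = CYCLES_PER_BIT * local_bits / f_local
    e_local = KAPPA * f_local ** 2 * CYCLES_PER_BIT * local_bits
    rate = BANDWIDTH * math.log2(1.0 + p_tx * gain / noise)  # Shannon capacity
    t_tx = offload_bits / rate
    t_mec = CYCLES_PER_BIT * offload_bits / f_mec
    delay = max(t_local, t_tx + t_mec)
    energy = e_local + p_tx * t_tx
    return delay, energy

def energy_efficiency(bits, delay, energy, deadline):
    """Bits processed per joule; zero if the delay requirement is violated."""
    if delay > deadline:
        return 0.0
    return bits / energy

# Crude exhaustive search over the offloading ratio for a single task.
bits, deadline = 1e6, 1.5
candidates = [i / 10 for i in range(11)]
best_rho = max(
    candidates,
    key=lambda r: energy_efficiency(
        bits, *task_metrics(bits, r, 1e9, 1e10, 0.1, 1e-6, 1e-13), deadline
    ),
)
```

With these assumed numbers, transmission is far cheaper than local computation, so full offloading maximizes efficiency. The grid search stands in for the decision the paper learns: DDPG-EEO replaces such brute-force enumeration with a policy trained over a continuous state/action space.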
Key words: mobile edge computing / task offloading / resource allocation / reinforcement learning / deep deterministic policy gradient
Classification: Information Technology and Security Science
Cite this article: CHEN Ka. Deep Deterministic Policy Gradient-based Energy Efficiency Optimization Algorithm for MEC Networks [J]. Fire Control & Command Control, 2024, 49(7): 44-49, 6.
Funding: Science and Technology Research Project of Henan Province (212102210516)
Soft Science Research Program of Henan Province (182400410608)
Higher Education Teaching Reform Research and Practice Project of Henan Province (2021SJGLX865)