Journal of Nanjing University of Posts and Telecommunications (Natural Science Edition), 2024, Vol. 44, Issue (6): 97-107, 11. DOI: 10.14132/j.cnki.1673-5439.2024.06.010
An integrated DQN-based adaptive cache algorithm for edge computing application
Abstract
In industrial applications, reinforcement learning can hardly achieve a good trade-off between model convergence and knowledge forgetting during training, given the dynamic and variable characteristics of data streams. Since content requests are highly correlated with the current production tasks in industrial applications, an adaptive caching strategy based on an integrated deep Q-network (IDQN) is proposed. In the offline phase, it trains and saves multiple historical task models using different historical task data. In the online phase, the network model is retrained whenever the task features of the real-time data stream change. If the features of the real-time data stream belong to any historical task, the corresponding historical task model is imported into the deep Q-network (DQN) for network training. Otherwise, the real-time data stream is used directly to train a new task model. The simulation results show that, compared with the reference algorithms, IDQN can effectively reduce the model convergence time and improve the caching efficiency even when the popularity of content requests changes dynamically.
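A minimal sketch of the online-phase decision described in the abstract, assuming a cosine-similarity test on task-feature vectors and a caller-supplied train_dqn routine; the threshold, data structures, and function names are illustrative assumptions, not the paper's implementation.

import numpy as np

# Illustrative sketch of the IDQN online phase: the feature vector of the
# current request stream is compared with the feature vectors of saved
# historical task models; a sufficiently similar historical model
# warm-starts DQN training, otherwise a new task model is trained and
# saved. The threshold and all names here are assumptions.

SIMILARITY_THRESHOLD = 0.8  # assumed cut-off for "belongs to a historical task"


def cosine_similarity(a, b):
    # Similarity between two task-feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


def match_historical_task(stream_features, historical_models):
    # Return the weights of the closest historical task model,
    # or None if no saved task is similar enough.
    best_id, best_score = None, -1.0
    for task_id, entry in historical_models.items():
        score = cosine_similarity(stream_features, entry["features"])
        if score > best_score:
            best_id, best_score = task_id, score
    if best_id is not None and best_score >= SIMILARITY_THRESHOLD:
        return historical_models[best_id]["weights"]
    return None


def online_retrain(stream_features, historical_models, train_dqn):
    # Retrain the caching DQN when the task features of the stream change.
    warm_start = match_historical_task(stream_features, historical_models)
    if warm_start is not None:
        # Import the matching historical task model into the DQN.
        weights = train_dqn(initial_weights=warm_start)
    else:
        # No matching historical task: train a new model from scratch
        # on the real-time data stream and save it for later reuse.
        weights = train_dqn(initial_weights=None)
        historical_models[f"task_{len(historical_models)}"] = {
            "features": np.asarray(stream_features, dtype=float),
            "weights": weights,
        }
    return weights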
Key words
industrial edge network / cache replacement policy / integrated reinforcement learning / deep Q-network (DQN)

Classification
Information technology and security science

Citation
Zhang Lei, Li Yawen, Wang Xiaojun. An integrated DQN-based adaptive cache algorithm for edge computing application[J]. Journal of Nanjing University of Posts and Telecommunications (Natural Science Edition), 2024, 44(6): 97-107, 11.

Funding
Supported by the National Natural Science Foundation of China (61971235, 52105553)