Journal of Harbin University of Commerce (Natural Science Edition), 2024, 40(2): 200-207.
Adaptive gradient descent optimization algorithm with correction term
Abstract
Mini-batch stochastic gradient descent (SGD) is widely used to train convolutional neural networks (CNNs), and the choice of optimizer directly affects the network's convergence speed. In recent years, several adaptive gradient descent optimization algorithms have been proposed, such as Adam and RAdam. However, these optimizers use neither the gradient norms of historical iterations nor the second moment of the gradients over the random subsample, which leads to slow convergence and unstable performance. This paper proposes a new adaptive gradient descent optimization algorithm, normEve, which combines historical gradient norms with the second moment of the gradients. Simulation experiments show that combining these two quantities effectively improves convergence speed. In a comparison test against the Adam optimizer, the new algorithm achieves higher accuracy, validating its practical applicability.
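For context, the abstract takes Adam as its baseline. Below is a minimal sketch of the standard Adam update rule, not the paper's normEve algorithm; the learning rate, beta values, and the toy quadratic objective are illustrative assumptions:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the standard Adam update.

    theta: parameter vector; grad: current gradient;
    m, v: running first/second moment estimates; t: 1-based step count.
    """
    m = beta1 * m + (1 - beta1) * grad      # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad**2   # second-moment (uncentered) estimate
    m_hat = m / (1 - beta1**t)              # bias correction for m
    v_hat = v / (1 - beta2**t)              # bias correction for v
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy example: minimize f(x) = x^2 starting from x = 5.0
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta                        # gradient of x^2
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)
# theta has moved close to the minimum at 0
```

The paper's contribution, per the abstract, is to augment this style of update with historical gradient norms in addition to the second-moment term.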
Keywords: gradient descent / neural networks / gradient norm / adaptive learning rate / classification / optimization algorithm
Classification: Information Technology and Security Science
Cite this article: Huang Jianyong, Zhou Yuejin. Adaptive gradient descent optimization algorithm with correction term[J]. Journal of Harbin University of Commerce (Natural Science Edition), 2024, 40(2): 200-207.
Funding
Supported by the State Key Laboratory of Mining Response and Disaster Prevention and Control in Deep Coal Mines (SKLMRDPC22KF03).