
基于Nesterov加速的改进自适应优化算法

An Improved Adaptive Optimization Algorithm Based on Nesterov Acceleration

钱振 1, 李德权 2

重庆工商大学学报(自然科学版), 2025, Vol. 42, Issue (3): 44-51, 8. DOI: 10.16055/j.issn.1672-058X.2025.0003.006

Author Information

  • 1. School of Mathematics and Big Data, Anhui University of Science and Technology, Huainan, Anhui 232001
  • 2. School of Artificial Intelligence, Anhui University of Science and Technology, Huainan, Anhui 232001

Abstract

Objective: Traditional optimization algorithms exhibit low training efficiency when training deep learning models, owing to growing parameter counts and deeper network layers. To address this issue, a Nadabelief optimization algorithm based on Nesterov acceleration was proposed to improve the efficiency of model training.

Methods: Firstly, the Adabelief algorithm was employed in place of the Adam algorithm to mitigate the generalization problem. Subsequently, starting from the classical momentum term of the first moment, the Nesterov momentum acceleration mechanism was incorporated into the Adabelief algorithm. During gradient updates, not only the gradient at the current step was considered, but the historical accumulated gradient was also used to adjust the magnitude of the update, so as to further improve the convergence of the algorithm. Finally, a regret bound was derived through theoretical analysis to guarantee the convergence of the algorithm.

Results: To verify the performance of the algorithm, logistic regression experiments were conducted in the convex setting, while image classification and language modeling experiments were carried out in the non-convex setting. Comparisons with algorithms such as Adam and Adabelief demonstrated the superiority of the Nadabelief algorithm. Additionally, the algorithm's robustness was confirmed by testing it under various initial learning rates.

Conclusion: The experiments demonstrate that the proposed algorithm not only retains the generalization ability of the original Adabelief algorithm but also achieves better convergence accuracy, further improving the efficiency of training deep learning models.
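The Methods paragraph above describes the construction only verbally. The NumPy snippet below is a minimal sketch of one way such an update could look, assuming AdaBelief's belief-based second moment is combined with a NAdam-style Nesterov look-ahead on the first moment; the function name nadabelief_step, the state layout, and the exact look-ahead formula are illustrative assumptions, not the authors' published update rule.

```python
import numpy as np

def nadabelief_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One update of a hypothetical Nadabelief-style optimizer: AdaBelief's
    'belief' second moment plus a NAdam-style Nesterov look-ahead on the
    first moment. Illustrative sketch; the paper's exact rule may differ."""
    t = state["t"] = state.get("t", 0) + 1
    m = state.get("m", np.zeros_like(theta))
    s = state.get("s", np.zeros_like(theta))

    # First moment: exponential moving average of the gradient (classical momentum).
    m = beta1 * m + (1 - beta1) * grad
    # AdaBelief second moment: EMA of the squared deviation of the gradient
    # from its own EMA (how much the current gradient is "believed").
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2 + eps

    # Bias corrections, as in Adam/AdaBelief.
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)

    # Nesterov look-ahead (NAdam-style): blend the corrected momentum with the
    # bias-corrected current gradient so the step anticipates the momentum direction.
    m_bar = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)

    theta = theta - lr * m_bar / (np.sqrt(s_hat) + eps)
    state["m"], state["s"] = m, s
    return theta, state

# Toy usage: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta, state = np.array([1.0, -2.0]), {}
for _ in range(500):
    theta, state = nadabelief_step(theta, 2 * theta, state, lr=0.1)
print(theta)  # approaches the minimizer at the origin
```

As in the abstract, the look-ahead term means each step uses both the current gradient and the accumulated historical gradient, which is what the NAdam-style blend in m_bar is meant to illustrate.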

Keywords

adaptive algorithms / Nesterov momentum acceleration / deep learning / image recognition / language modeling

Classification

Computer and Automation

Cite This Article

钱振, 李德权. 基于Nesterov加速的改进自适应优化算法[J]. 重庆工商大学学报(自然科学版), 2025, 42(3): 44-51, 8.

Funding

Academic and Technical Leaders and Reserve Candidates Program of Anhui Province (2019H211)

重庆工商大学学报(自然科学版)

ISSN 1672-058X
