Journal of Xidian University (Natural Science Edition), 2024, Vol. 51, Issue 3: 147-157, 11. DOI: 10.19665/j.issn1001-2400.20240301
Relatively accelerated stochastic gradient algorithm for a class of non-smooth convex optimization problems
Abstract
First-order methods are widely used in fields such as machine learning, big data science, and computer vision. A crucial and standard assumption for almost all first-order methods is that the gradient of the objective function is globally Lipschitz continuous, which, however, cannot be satisfied by many practical problems. By introducing stochasticity and acceleration into the vanilla GD (Gradient Descent) algorithm, an RASGD (Relatively Accelerated Stochastic Gradient Descent) algorithm is developed, under which the objective function only needs to satisfy a mild relative-smoothness condition rather than gradient Lipschitz continuity. The convergence of RASGD depends on the UTSE (Uniformly Triangle Scaling Exponent). To avoid the cost of tuning this parameter, an ARASGD (Adaptively Relatively Accelerated Stochastic Gradient Descent) algorithm is further proposed. The theoretical convergence analysis shows that the objective function values of the iterates converge to the optimal value. Numerical experiments are conducted on the Poisson inverse problem and on a minimization problem in which the operator norm of the Hessian of the objective function grows as a polynomial in the norm of the variable; the results show that the convergence performance of the ARASGD and RASGD methods is better than that of the RSGD method.
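For context, the relative smoothness referred to above (in the sense of the Bregman-divergence framework of Bauschke, Bolte and Teboulle and of Lu, Freund and Nesterov; the notation below is ours and may differ from the paper's) replaces gradient Lipschitz continuity with a bound relative to a convex reference function $h$: a function $f$ is $L$-smooth relative to $h$ if

$$ f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + L\, D_h(y, x), \qquad D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle. $$

Choosing $h(x) = \tfrac{1}{2}\|x\|_2^2$ gives $D_h(y,x) = \tfrac{1}{2}\|y - x\|_2^2$ and recovers ordinary $L$-smoothness. In the accelerated Bregman proximal gradient literature, a triangle scaling exponent $\gamma$ of $D_h$ is one for which

$$ D_h\big((1-\theta)x + \theta \tilde z,\; (1-\theta)x + \theta z\big) \le \theta^{\gamma}\, D_h(\tilde z, z), \qquad \theta \in [0,1], $$

and it governs the $O(1/k^{\gamma})$ rate of the accelerated scheme; presumably the paper's UTSE plays this role, which is why adapting to it (as ARASGD does) removes a tuning burden.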
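To make the Poisson-inverse experiment concrete, the following is a minimal NumPy sketch of a non-accelerated relative (Bregman) SGD step with the Burg entropy kernel h(x) = -sum_j log x_j, under which the Poisson likelihood is known to be smooth relative to h. This is our illustrative sketch, not the paper's RASGD/ARASGD implementation (which additionally incorporates acceleration and adaptivity); the synthetic data, step-size rule, and positivity safeguard are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic Poisson inverse problem: observe b ~ Poisson(A @ x_true) with A >= 0,
# and minimize f(x) = sum_i (<a_i, x> - b_i * log<a_i, x>) over x > 0.
m, n = 200, 50
A = rng.uniform(0.1, 1.0, size=(m, n))
x_true = rng.uniform(0.5, 2.0, size=n)
b = rng.poisson(A @ x_true).astype(float)

def stoch_grad(x):
    """Unbiased estimate of grad f(x) from one randomly sampled term."""
    i = rng.integers(m)
    return m * A[i] * (1.0 - b[i] / (A[i] @ x))

# Bregman (mirror) SGD step with the Burg entropy kernel h(x) = -sum_j log x_j:
# grad h(x) = -1/x, so grad h(x_next) = grad h(x) - eta * g gives, elementwise,
#   x_next_j = x_j / (1 + eta * x_j * g_j).
x = np.ones(n)
L_rel = b.sum()            # f is smooth relative to h with constant sum(b)
for k in range(1, 5001):
    eta = 1.0 / (L_rel * np.sqrt(k))              # decaying step size (our choice)
    g = stoch_grad(x)
    x = x / np.maximum(1.0 + eta * x * g, 1e-12)  # guard keeps x strictly positive

print("final objective:", np.sum(A @ x - b * np.log(A @ x)))

The closed-form multiplicative update keeps the iterates strictly positive, which is the practical payoff of matching the kernel to the problem's geometry instead of relying on a global Lipschitz constant.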
Key words
convex optimization / nonsmooth optimization / relatively smooth / stochastic programming / gradient method / accelerated stochastic gradient descent
Classification
Mathematical Sciences
Cite this article
张文娟, 冯象初, 肖锋, 黄姝娟, 李欢. Relatively accelerated stochastic gradient algorithm for a class of non-smooth convex optimization problems[J]. Journal of Xidian University (Natural Science Edition), 2024, 51(3): 147-157, 11.
Funding
Natural Science Basic Research Program of Shaanxi Province (2021-JM440)
National Natural Science Foundation of China (62171361)
Key Research and Development Program of Shaanxi Province (2022GY-119)