Computer Engineering and Applications, 2024, Vol. 60, Issue (14): 133-143, 11. DOI: 10.3778/j.issn.1002-8331.2304-0250
Graph Convolutional Neural Networks Optimized by Momentum Cosine Similarity Gradient
Abstract
Traditional gradient descent algorithms use only an exponentially weighted accumulation of historical gradients and do not exploit the local changes of the gradient, which can cause the optimization process to overshoot the global optimum; even when it converges to the optimum, it oscillates near it. Using such algorithms to train graph convolutional neural networks results in slow convergence and low test accuracy. In this paper, the cosine similarity is used to dynamically adjust the learning rate, yielding the cosine similarity gradient descent (SimGrad) algorithm. To further improve the convergence speed and test accuracy of graph convolutional neural network training and to reduce oscillation, the momentum cosine similarity gradient descent (NSimGrad) algorithm is proposed by combining SimGrad with the momentum idea. Convergence analysis proves that both SimGrad and NSimGrad achieve a regret bound of O(√T). The algorithms are tested on three constructed non-convex functions and evaluated on four datasets with graph convolutional neural networks. Experimental results show that SimGrad ensures the convergence of the graph convolutional neural network, while NSimGrad further improves the convergence speed and test accuracy of its training. SimGrad and NSimGrad have better global convergence and optimization ability than Adam and Nadam.
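The abstract does not spell out the update rule. The sketch below illustrates the general idea of scaling the learning rate by the cosine similarity between the current gradient and a momentum buffer; the function names, the (1 + sim)/2 scaling, and the hyperparameters are illustrative assumptions, not the authors' exact SimGrad/NSimGrad definitions.

```python
import numpy as np

def cosine_sim(a, b, eps=1e-12):
    """Cosine similarity between two flattened gradient vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + eps))

def nsimgrad_like_update(params, grad, state, base_lr=0.01, beta=0.9):
    """One parameter update that scales the learning rate by the cosine
    similarity between the current gradient and the momentum buffer.
    Illustrative sketch only, not the authors' exact NSimGrad rule."""
    m = state.get("m", np.zeros_like(grad))
    # Momentum: exponentially weighted accumulation of historical gradients.
    m = beta * m + (1.0 - beta) * grad
    # Local change of the gradient: similarity of current gradient to momentum.
    # Agreeing directions (sim -> 1) keep a large step; opposing directions
    # (sim -> -1) shrink the step, damping oscillation near an optimum.
    sim = cosine_sim(grad.ravel(), m.ravel())
    lr = base_lr * (1.0 + sim) / 2.0          # map [-1, 1] to [0, base_lr]
    params = params - lr * m
    state["m"] = m
    return params, state

# Usage example: minimize f(x) = (x - 3)^2 starting from x = 0.
params, state = np.array([0.0]), {}
for _ in range(200):
    grad = 2.0 * (params - 3.0)
    params, state = nsimgrad_like_update(params, grad, state, base_lr=0.1)
```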
Key words: gradient descent algorithm / cosine similarity / graph convolutional neural network / regret bound / global convergence
Classification: Information Technology and Security Science
Cite this article: 闫建红, 段运会. 动量余弦相似度梯度优化图卷积神经网络[J]. 计算机工程与应用, 2024, 60(14): 133-143, 11.
Funding
Shanxi Provincial Key Research and Development Program (202102010101008)
University-level Graduate Education Reform Project