Privacy-preserving Scheme for SVM Training Based on Mini-batch SGD
Abstract: Privacy protection is critical when a support vector machine (SVM) is used to process sensitive data. Existing SVM privacy-preserving schemes train with batch gradient descent (BGD), which incurs a huge computational overhead. To address this problem, this paper proposes a privacy-preserving scheme for SVM training based on mini-batch stochastic gradient descent (Mini-batch SGD). First, an SVM training algorithm based on Mini-batch SGD is designed. On this basis, the model weights are perturbed multiplicatively, with the hardness assumption of integer factorization guaranteeing model privacy; the data are encrypted with a homomorphic cryptosystem before SVM training is performed, and a homomorphic hash function is then applied for verification. Finally, the complete SVM privacy-preserving scheme is constructed. Against the stated security threats, data privacy, model privacy, and model correctness are proved. Simulation experiments and analysis show that, with classification performance close to that of existing schemes, the proposed scheme saves 92.4% of the computation time on average.
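The abstract's starting point is replacing BGD with mini-batch SGD for SVM training. The following is a minimal plain-text sketch of that building block, assuming a linear SVM trained on the sub-gradient of the regularized hinge loss; it is illustrative only and omits the paper's encryption, weight perturbation, and verification steps, and all function and parameter names are hypothetical.

```python
import numpy as np

def train_svm_minibatch_sgd(X, y, lam=0.01, lr=0.1, epochs=50, batch=16, seed=0):
    """Linear SVM via mini-batch SGD on lam/2*||w||^2 + mean(hinge loss).

    y must take values in {-1, +1}. Plain-text sketch only: the scheme in
    the paper additionally encrypts the data homomorphically and perturbs
    the weights, which is omitted here.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        idx = rng.permutation(n)          # reshuffle each epoch
        for start in range(0, n, batch):
            B = idx[start:start + batch]  # indices of the current mini-batch
            Xb, yb = X[B], y[B]
            margin = yb * (Xb @ w + b)
            active = margin < 1           # points violating the margin
            # sub-gradient over the mini-batch only (this is the whole
            # difference from BGD, which would use all n samples here)
            gw = lam * w - (yb[active] @ Xb[active]) / len(B)
            gb = -yb[active].sum() / len(B)
            w -= lr * gw
            b -= lr * gb
    return w, b
```

Because each update touches only `batch` samples instead of all `n`, the per-step cost drops proportionally, which is the source of the computation-time savings the abstract reports once the same structure is run over encrypted data.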
王杰昌;刘玉岭;张平;刘牧华;赵新辉
1. Sports Big Data Center, Physical Education College of Zhengzhou University, Zhengzhou 450044, China; 2. Institute of Information Engineering, Chinese Academy of Sciences, Beijing 100085, China; 3. School of Mathematics and Statistics, Henan University of Science and Technology, Luoyang, Henan 471023, China; 4. Intelligent System Science and Technology Innovation Center, Longmen Laboratory, Luoyang, Henan 471023, China
Subject category: Computer Science and Automation
Keywords: Mini-batch SGD; SVM; homomorphic encryption; homomorphic hash function; privacy-preserving
《信息安全研究》 (Journal of Information Security Research), 2024, No. 10
Pages 967-974 (8 pages)
Funding: National Natural Science Foundation of China (62102134); Technology Domain Fund of the Basic Strengthening Program (2021-JCJQ-JJ-0908); Henan Province Science and Technology Research Project (232102210138, 232102210130, 232102320309); Major Science and Technology Project of Longmen Laboratory (231100220300); Key Scientific Research Project of Higher Education Institutions of Henan Province (23A520046, 23A413005)