Computer Engineering and Applications, 2011, Vol. 47, Issue 12: 200-202, 212. DOI: 10.3778/j.issn.1002-8331.2011.12.056
Improving SVM's learning efficiency by using matrix LDLT parallel decomposition
Abstract
When a support vector machine is trained on a large-scale dataset, training time grows long and generalization capability degrades. A path-following interior point method is proposed to design the SVM learning algorithm for large-scale datasets; the key factor limiting SVM learning efficiency on such datasets is solving the large-scale iterative direction equations efficiently. To improve learning efficiency, the dimension of the direction equations is first reduced, and an LDLT parallel decomposition method is then used to solve the resulting direction sub-equations efficiently. Experimental results show that the new SVM training algorithm is efficient on large-scale datasets and does not impair the generalization capability of the SVM.
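As an illustration of the core linear-algebra step the abstract refers to, the following is a minimal, serial sketch of an LDL^T factorization used to solve a symmetric positive definite system A x = b (the form the interior-point direction equations take after dimension reduction). The paper's method is a *parallel* decomposition; this plain-Python version shows only the factorization and the triangular solves, and all function names here are our own, not from the paper.

```python
def ldlt_decompose(A):
    """Factor symmetric positive definite A as L * D * L^T,
    with L unit lower triangular and D diagonal."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    D = [0.0] * n
    for j in range(n):
        D[j] = A[j][j] - sum(L[j][k] ** 2 * D[k] for k in range(j))
        for i in range(j + 1, n):
            L[i][j] = (A[i][j]
                       - sum(L[i][k] * L[j][k] * D[k] for k in range(j))) / D[j]
    return L, D

def ldlt_solve(L, D, b):
    """Solve A x = b given A = L D L^T: forward substitution,
    diagonal scaling, then backward substitution."""
    n = len(b)
    z = [0.0] * n
    for i in range(n):                      # L z = b
        z[i] = b[i] - sum(L[i][k] * z[k] for k in range(i))
    y = [z[i] / D[i] for i in range(n)]     # D y = z
    x = [0.0] * n
    for i in reversed(range(n)):            # L^T x = y
        x[i] = y[i] - sum(L[k][i] * x[k] for k in range(i + 1, n))
    return x

# Small symmetric positive definite example; b = A @ [1, 2, 3].
A = [[4.0, 2.0, 2.0],
     [2.0, 3.0, 1.0],
     [2.0, 1.0, 5.0]]
b = [14.0, 11.0, 19.0]
L, D = ldlt_decompose(A)
x = ldlt_solve(L, D, b)     # close to [1.0, 2.0, 3.0]
```

In a parallel setting of the kind the abstract describes, independent sub-blocks of the reduced system can be factored and solved concurrently; the sketch above covers only the per-block computation.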
Keywords
large-scale support vector machine; path-following interior point method; matrix LDLT parallel decomposition
Classification
Information technology and security science
Citation
覃华, 徐燕子. Improving SVM's learning efficiency by using matrix LDLT parallel decomposition [J]. Computer Engineering and Applications, 2011, 47(12): 200-202, 212.
Funding
Supported by the Innovation Team Program of the Guangxi Universities Talent Highland Construction Project (No. 桂教人[2007]71).