Computer Engineering, 2026, Vol. 52, Issue (1): 126-135, 10. DOI: 10.19678/j.issn.1000-3428.0069996
Graph Neural Network Inference Optimization Based on Adversarial Training and Contrastive Representation Distillation
Abstract
Graph Neural Networks (GNNs) excel in node classification tasks, but their message-passing mechanism causes neighbor-fetching latency, limiting deployment in latency-sensitive applications. Despite being less accurate than GNNs in node classification, the Multi-Layer Perceptron (MLP) is preferred in practical industrial applications owing to its efficient inference. Given the complementary advantages and disadvantages of GNNs and MLPs, this paper proposes an optimized inference method for GNNs based on adversarial training and contrastive representation distillation, which transfers the knowledge learned by a GNN teacher model to a more efficient MLP student model. The method uses the Fast Gradient Sign Method (FGSM) to generate feature perturbations and combines them with node content features as input to the student model; adversarial training is then conducted under the guidance of the true labels and the teacher model's Softmax probability distribution, reducing the student model's sensitivity to noise in node features. The contrastive representation distillation module treats the student's and teacher's embeddings of the same node as a positive sample pair and embeddings of different nodes as negative sample pairs. By minimizing the distance between positive pairs and maximizing the distance between negative pairs, the student model captures the relationships among the node embeddings output by the teacher model, thereby preserving the global topological structure learned by the GNN. Experimental results on public datasets demonstrate that, with GraphSAGE as the teacher model, an MLP student trained by this method achieves an inference speed 89 times that of GraphSAGE, and its accuracy improves by 14.12 and 2.02 percentage points on average over the vanilla MLP and GraphSAGE, respectively, outperforming both baseline methods.
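The following is a minimal PyTorch sketch of the two training signals described in the abstract, not the paper's released implementation: the StudentMLP class, the loss weights alpha and beta, the temperature tau, and the FGSM step size epsilon are illustrative assumptions. The teacher's logits and embeddings (t_logits, t_emb) are assumed to come from a frozen GNN teacher such as GraphSAGE, computed in advance on the full graph.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentMLP(nn.Module):
    """Graph-free student: consumes node features only and returns
    (class logits, penultimate embedding). Hypothetical architecture."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.head = nn.Linear(hid_dim, n_classes)

    def forward(self, x):
        z = self.body(x)
        return self.head(z), z

def fgsm_perturb(student, x, y, epsilon=0.01):
    # FGSM: one-step perturbation of the node features along the sign
    # of the gradient of the classification loss.
    x_adv = x.clone().detach().requires_grad_(True)
    logits, _ = student(x_adv)
    F.cross_entropy(logits, y).backward()
    return (x_adv + epsilon * x_adv.grad.sign()).detach()

def crd_loss(z_s, z_t, tau=0.5):
    # InfoNCE-style contrastive distillation: the student/teacher
    # embeddings of the same node form the positive pair (diagonal of
    # the similarity matrix); all cross-node pairs act as negatives.
    z_s, z_t = F.normalize(z_s, dim=1), F.normalize(z_t, dim=1)
    logits = z_s @ z_t.t() / tau                     # (N, N) similarities
    return F.cross_entropy(logits, torch.arange(len(z_s)))

def train_step(student, t_logits, t_emb, x, y, opt,
               epsilon=0.01, alpha=0.5, beta=0.5):
    # t_logits / t_emb: outputs of the frozen GNN teacher, precomputed.
    x_adv = fgsm_perturb(student, x, y, epsilon)
    opt.zero_grad()                                  # drop FGSM's param grads
    s_logits, s_emb = student(x_adv)                 # student sees no graph
    loss = (F.cross_entropy(s_logits, y)             # hard-label supervision
            + alpha * F.kl_div(F.log_softmax(s_logits, 1),
                               F.softmax(t_logits, 1),
                               reduction='batchmean')  # teacher Softmax guide
            + beta * crd_loss(s_emb, t_emb))           # contrastive module
    loss.backward()
    opt.step()
    return loss.item()
```

At inference time only the student MLP is executed, so no neighbor fetching is required; this is the source of the speedup over the GNN teacher reported above.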
Keywords
Graph Neural Network (GNN) / knowledge distillation / inference acceleration / contrastive learning / adversarial training
Classification
Information Technology and Security Science
Cite this article
LI Qiang, TAN Xingyi, ZHENG Wei, LIU Zhen, YANG Wenhai. Graph Neural Network Inference Optimization Based on Adversarial Training and Contrastive Representation Distillation[J]. Computer Engineering, 2026, 52(1): 126-135, 10.
Funding
Science and Technology Program of Hunan Province (2021GK5014).