农业大数据学报, 2025, Vol. 7, Issue (2): 144-154, 11. DOI: 10.19788/j.issn.2096-6369.000106
AI知识蒸馏技术演进与应用综述
A Review of the Evolution and Applications of AI Knowledge Distillation Technology
Abstract
Knowledge Distillation (KD) in Artificial Intelligence (AI) achieves model lightweighting through a teacher-student framework, emerging as a key technology for addressing the performance-efficiency bottleneck in deep learning. This paper systematically analyzes KD's theoretical framework from the perspective of algorithm evolution, categorizing knowledge transfer paths into four paradigms: response-based, feature-based, relation-based, and structure-based. It establishes a comparative evaluation system for dynamic and static KD methods. We explore in depth innovative mechanisms such as cross-modal feature alignment, adaptive distillation architectures, and multi-teacher collaborative validation, while analyzing fusion strategies such as progressive knowledge transfer and adversarial distillation. Through empirical analysis in computer vision and natural language processing, we assess KD's practicality in scenarios such as image classification, semantic segmentation, and text generation. Notably, we highlight KD's potential in agriculture and the geosciences, enabling efficient deployment in resource-constrained settings for precision agriculture and geospatial analysis. Current models often face issues such as ambiguous knowledge-selection mechanisms and insufficient theoretical interpretability. Accordingly, we discuss the feasibility of automated distillation systems and multimodal knowledge fusion, offering new technical pathways for edge-intelligence deployment and privacy computing, particularly suited to agricultural intelligence and geoscience research.

Keywords
knowledge distillation / model compression / knowledge transfer / dynamic optimization / multimodal learning

Citation
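As a concrete illustration of the response-based paradigm the abstract names first, a minimal sketch of the classic soft-target distillation loss is given below. The temperature `T`, mixing weight `alpha`, and function names are illustrative assumptions, not taken from the reviewed paper: the student is trained on an alpha-weighted mix of the KL divergence to the teacher's temperature-softened output distribution and the ordinary cross-entropy on the hard label.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; a higher T softens the distribution,
    exposing the teacher's 'dark knowledge' about non-target classes."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.7):
    """Response-based distillation loss (Hinton-style sketch):
    alpha * T^2 * KL(teacher_soft || student_soft)
      + (1 - alpha) * cross-entropy on the hard label.
    The T^2 factor keeps soft-target gradients on the same scale
    as the hard-label term when T changes."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_teacher, p_student))
    ce = -math.log(softmax(student_logits)[true_label])
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

When the student already matches the teacher exactly, the KL term vanishes and only the down-weighted hard-label term remains, which is one easy sanity check on an implementation.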
毛克彪, 代旺, 郭中华, 孙学宏, 肖柳瑞. AI知识蒸馏技术演进与应用综述[J]. 农业大数据学报, 2025, 7(2): 144-154, 11.

Funding
Special Fund for Basic Research Operating Expenses of Central-level Public-interest Research Institutes (No. Y2025YC86)
Key Project of the Natural Science Foundation of the Ningxia Science and Technology Department (No. 2024AC02032)