Computer Engineering and Applications, 2024, Vol. 60, Issue 24: 119-130. DOI: 10.3778/j.issn.1002-8331.2308-0102
Automatic Annotation of Knowledge Points in Picture-Based Educational Resources for Knowledge Scenarios
Abstract
Aiming at the challenge of inconsistency between the visual features of picture resources and the semantics of advanced knowledge, a new automatic annotation algorithm for knowledge points is proposed, called the situational hypergraph convolutional network based on knowledge scenarios (SHGCN), which can efficiently organize and manage picture data, promote knowledge understanding and utilization, and improve educational intelligence. The algorithm not only extracts explicit visual features of the picture resources, but also mines knowledge information hidden in fine-grained regions. Faster R-CNN and OCR techniques are utilized to identify knowledge entities such as knowledge objects and coordinate texts, and multi-granularity features are fused to generate knowledge vectors. Then, a dual-screening mechanism is proposed to construct different types of knowledge scenarios, and the knowledge scenarios are used as hyperedges to construct a situational hypergraph to model higher-order knowledge correlations between images containing similar knowledge information. Finally, hypergraph convolution is used to complete the information aggregation of knowledge-similar pictures, realizing the transformation from "vision-semantic" to "vision-semantic-knowledge". This paper also constructs a physics picture dataset to train and validate SHGCN. Experimental results show that SHGCN outperforms current state-of-the-art methods by fusing explicit visual features and implicit knowledge information of pictures.
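The hypergraph convolution step described in the abstract, in which pictures connected by shared knowledge-scenario hyperedges exchange information, can be sketched as follows. This is a minimal illustration using the standard HGNN-style degree normalization, not the paper's exact SHGCN layer; the incidence matrix `H`, weights `Theta`, and hyperedge weights `w` are hypothetical placeholders.

```python
import numpy as np

def hypergraph_conv(X, H, Theta, w=None):
    """One hypergraph convolution layer (HGNN-style normalization; a sketch,
    not the SHGCN layer from the paper).

    X:     (n, d) node features (one row per picture)
    H:     (n, m) incidence matrix; H[i, j] = 1 if picture i belongs to
           knowledge scenario (hyperedge) j
    Theta: (d, d_out) learnable projection weights
    w:     (m,) optional hyperedge weights (defaults to all ones)
    """
    n, m = H.shape
    w = np.ones(m) if w is None else w
    W = np.diag(w)
    Dv = H @ w                       # node degrees (weighted)
    De = H.sum(axis=0)               # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(Dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(De, 1e-12))
    # Nodes aggregate features from all nodes that share a hyperedge,
    # then project through Theta and apply ReLU.
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)
```

For example, with four pictures and two knowledge scenarios, pictures 1, 2 and 4 sharing scenario 1 will mix their feature vectors, while picture 3 interacts only through scenario 2.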
Keywords: knowledge point annotation; hypergraph convolutional neural network; knowledge scenarios; situational hypergraph
Classification: Information Technology and Security Science
Citation: WANG Jing, DU Xu, LI Hao, HU Zhuang. Automatic Annotation of Knowledge Points in Picture-Based Educational Resources for Knowledge Scenarios[J]. Computer Engineering and Applications, 2024, 60(24): 119-130.
Funding: National Natural Science Foundation of China (62177020, 62407009); Chongqing Municipal Education Commission Youth Project (KJQN202400642).