高技术通讯 (High Technology Letters), 2025, Vol. 35, Issue (2): 183-197, 15. DOI: 10.3772/j.issn.1002-0470.2025.02.008
Human-robot interaction method and system design by fusing human perception and multimodal gestures
Abstract
To address the problem that existing human-robot interaction (HRI), constrained by pre-programmed forms and unable to perceive human interaction intentions, lacks flexibility and generalization across different tasks, this paper proposes an HRI method that fuses human perception and multimodal gestures. Firstly, a multimodal hand detection method that incorporates human perception is designed. The method takes human poses as a prior to obtain multimodal hand features, dynamically adapts to different detection distances, realizes online detection of multi-person interaction gestures, and obtains the correspondence between interaction commands and personnel identities. Secondly, based on the hand detection method, a multimodal interaction gesture dataset is collected and a general gesture interaction instruction set is constructed. Thirdly, a multimodal gesture recognition method is designed, in which data augmentation and gesture rotation mapping are used to reduce the impact of complex scenarios on recognition. Finally, a framework for the proposed HRI method is built. Experimental results indicate that the proposed hand detection method is practically usable, and that the multimodal gesture recognition method reaches an accuracy of over 99%, outperforming single-modality and other methods. Typical human-robot interaction tasks, such as collaborative assembly, collaborative transportation, and task-point recording and reproduction, verify the feasibility and effectiveness of the proposed HRI method.
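The pose-prior hand detection step summarized above can be illustrated with a minimal sketch. The Python snippet below is not the paper's implementation; the function hand_roi_from_pose, its scale parameter, and the forearm offset factor are hypothetical choices used only to show how elbow and wrist keypoints can yield a hand region whose size tracks the person's distance from the camera, from which aligned RGB and depth (multimodal) patches are cropped.

```python
import numpy as np

def hand_roi_from_pose(elbow, wrist, scale=1.5):
    """Estimate a square hand region from elbow/wrist keypoints (illustrative only).

    The box is centred just beyond the wrist along the forearm direction and
    sized relative to the forearm length, so it shrinks or grows with the
    person's distance from the camera.
    """
    elbow, wrist = np.asarray(elbow, float), np.asarray(wrist, float)
    forearm = wrist - elbow
    length = np.linalg.norm(forearm)
    if length < 1e-6:                      # degenerate pose, no usable prior
        return None
    center = wrist + 0.3 * forearm         # hand lies slightly past the wrist
    half = 0.5 * scale * length            # box size follows forearm length
    x0, y0 = center - half
    x1, y1 = center + half
    return int(x0), int(y0), int(x1), int(y1)

# Example: crop the same region from aligned RGB and depth frames,
# yielding a multimodal hand patch tied to one person's pose (identity).
rgb = np.zeros((480, 640, 3), dtype=np.uint8)      # placeholder RGB frame
depth = np.zeros((480, 640), dtype=np.uint16)      # placeholder depth frame
box = hand_roi_from_pose(elbow=(300, 240), wrist=(360, 250))
if box is not None:
    x0, y0, x1, y1 = box
    rgb_patch = rgb[max(y0, 0):y1, max(x0, 0):x1]
    depth_patch = depth[max(y0, 0):y1, max(x0, 0):x1]
```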
Keywords
human-robot interaction / human perception / multimodal gesture recognition / interaction task

Cite this article
禹鑫燚, 张鑫, 许成军, 欧林林. Human-robot interaction method and system design by fusing human perception and multimodal gestures[J]. 高技术通讯 (High Technology Letters), 2025, 35(2): 183-197, 15.

Funding
Supported by the National Natural Science Foundation of China (62203392) and the Natural Science Foundation of Zhejiang Province (LY21F030018, LQ22F030021).