计算机应用研究, 2023, Vol. 40, Issue 12: 3683-3689. DOI: 10.19734/j.issn.1001-3695.2022.10.0532
Aspect-level multimodal sentiment analysis based on interaction graph neural network
Abstract
The key to multimodal sentiment representation is to effectively extract and fuse features from multimodal data. Although the cross-attention mechanism can enhance feature fusion across modalities, cross-attention only establishes the association between the global semantics of one modality and the local features of another, which is not enough to reflect how the modalities align at the level of local features. In order to obtain in-depth interaction information between multiple modalities, this paper proposed a modal interaction graph neural network, which connects semantic units of different modalities by means of aspect words to form a multimodal interaction graph, and then uses the message-passing mechanism of the graph attention network to carry out feature fusion. Experimental results on two benchmark datasets show that, compared with current advanced attention models, the modal interaction graph neural network is more effective in realizing feature interaction between local information and has lower time complexity.
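The abstract describes linking semantic units of different modalities through aspect words into a multimodal interaction graph and fusing features by graph-attention message passing. The following is a minimal sketch of that idea, not the authors' implementation: the feature dimension, the node counts, the single-head GAT-style layer, and the toy adjacency (one aspect node connected to a few text-token and image-region nodes) are all illustrative assumptions.

# Minimal sketch (assumed, not the paper's code): text tokens and image regions
# become graph nodes, an aspect node bridges the two modalities, and one
# graph-attention layer fuses features by message passing over the graph.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head GAT-style layer: attention-weighted neighbour aggregation."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (N, in_dim) node features; adj: (N, N) 0/1 adjacency with self-loops
        h = self.proj(x)                                    # (N, out_dim)
        n = h.size(0)
        hi = h.unsqueeze(1).expand(n, n, -1)                # h_i broadcast over j
        hj = h.unsqueeze(0).expand(n, n, -1)                # h_j broadcast over i
        e = F.leaky_relu(self.attn(torch.cat([hi, hj], dim=-1)).squeeze(-1))
        e = e.masked_fill(adj == 0, float("-inf"))          # keep only graph edges
        alpha = torch.softmax(e, dim=-1)                    # attention over neighbours
        return F.elu(alpha @ h)                             # message passing / fusion


if __name__ == "__main__":
    d = 16
    text_nodes = torch.randn(4, d)    # e.g. 4 token-level semantic units
    image_nodes = torch.randn(3, d)   # e.g. 3 region-level semantic units
    aspect_node = torch.randn(1, d)   # the aspect word shared by both modalities
    x = torch.cat([text_nodes, image_nodes, aspect_node], dim=0)  # 8 nodes

    # Toy interaction graph: self-loops everywhere, and the aspect node
    # (index 7) links to every text and image node, so cross-modal
    # information flows through the aspect word.
    adj = torch.eye(8)
    adj[7, :] = 1.0
    adj[:, 7] = 1.0

    fused = GraphAttentionLayer(d, d)(x, adj)
    print(fused.shape)  # torch.Size([8, 16])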
Key words
aspect-level multimodal sentiment analysis / modal interaction graph neural network / graph attention network
Classification
Information technology and security science
Cite this article
Li Li, Li Ping. Aspect-level multimodal sentiment analysis based on interaction graph neural network [J]. 计算机应用研究, 2023, 40(12): 3683-3689.
Funding
National Natural Science Foundation of China (61873218)
Southwest Petroleum University Innovation Base Project (642)