Modern Applied Physics, 2025, Vol. 16, Issue (4): 195-202, 8. DOI: 10.12061/j.issn.2095-6223.202408010
Solving Electromagnetic Field Distributions With Transformer Architecture
Abstract
Inspired by the application of large language models in natural language processing and other fields, more and more researchers are exploring attention mechanisms and deep learning to decode relationships between data and to solve partial differential equations quickly. Building on previous work on JefiAtten (a new neural network model based on an improved Transformer architecture), this paper conducts further research on solving electromagnetic field distributions. The JefiAtten model uses self-attention and cross-attention modules, instead of the positional encodings of the traditional Transformer architecture, to capture the interaction between charge density, current density, and the electromagnetic field. This paper presents the testing details of the model on a dataset consisting of 500 time sequences constructed using trigonometric functions. The numerical results demonstrate that the JefiAtten model generalizes well to a variety of scenarios and maintains accuracy under different spatial distributions and amplitude variations. The model's generalizability indicates its broad application potential in computational physics, and it can be further improved to enhance its predictive ability and computational efficiency.
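The abstract describes cross-attention coupling source terms (charge and current density) to the field, in place of positional encodings. The actual JefiAtten implementation is not given here; the following is a minimal NumPy sketch of scaled dot-product cross-attention under that reading, where field-side features form the queries and source-side features form the keys and values. All names, dimensions, and the random projection weights are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def cross_attention(q_in, kv_in, d_k=8, seed=0):
    """Minimal single-head cross-attention sketch (illustrative only).

    q_in:  (n_q, d)  query-side features, e.g. field grid points
    kv_in: (n_kv, d) key/value-side features, e.g. charge/current densities
    Returns an (n_q, d_k) array: each query is a weighted mix of values.
    """
    rng = np.random.default_rng(seed)
    d = q_in.shape[1]
    # random projection matrices stand in for learned weights
    W_q = rng.standard_normal((d, d_k))
    W_k = rng.standard_normal((d, d_k))
    W_v = rng.standard_normal((d, d_k))
    Q, K, V = q_in @ W_q, kv_in @ W_k, kv_in @ W_v
    scores = Q @ K.T / np.sqrt(d_k)                   # (n_q, n_kv)
    # numerically stable row-wise softmax over the source points
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V                                # (n_q, d_k)

# toy example: 4 field points attending over 6 source points, feature dim 5
field_feats  = np.random.default_rng(1).standard_normal((4, 5))
source_feats = np.random.default_rng(2).standard_normal((6, 5))
out = cross_attention(field_feats, source_feats)
print(out.shape)  # (4, 8)
```

Because the queries and keys/values come from different inputs, this mechanism lets every field point weigh every source point directly, which is one way such a model can remain accurate when the spatial distribution of sources changes.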
Keywords: attention mechanism / electromagnetic field distribution / Transformer architecture / partial differential equation / deep learning
Classification: Information Technology and Security Science
Citation: 田文丽, 张俊杰, 孙铭言, 李振锋, 高银军, 张冬晓, 杜太焦. Solving Electromagnetic Field Distributions With Transformer Architecture[J]. Modern Applied Physics, 2025, 16(4): 195-202, 8.
Funding
National Natural Science Foundation of China (12405318)
National Major Research and Development Program of China (2020YFA0709800)