Big Data Mining and Analytics, 2025, Vol. 8, Issue 1: 45-64, 20. DOI: 10.26599/BDMA.2024.9020036
GPT-NAS:Neural Architecture Search Meets Generative Pre-Trained Transformer Model
Abstract
Keywords
Neural Architecture Search (NAS) / Generative Pre-trained Transformer (GPT) model / evolutionary algorithm / image classification
Cite this article: Caiyang Yu, Xianggen Liu, Yifan Wang, Yun Liu, Wentao Feng, Xiong Deng, Chenwei Tang, Jiancheng Lv. GPT-NAS: Neural Architecture Search Meets Generative Pre-Trained Transformer Model[J]. Big Data Mining and Analytics, 2025, 8(1): 45-64, 20.
Funding
This work was supported by the National Natural Science Foundation of China (No. 62106161), the Fundamental Research Funds for the Central Universities (No. 1082204112364), the Sichuan University-Luzhou Municipal Government Strategic Cooperation Project (No. 2022CDLZ-8), the Key R&D Program of Sichuan Province (Nos. 2022YFN0017 and 2023YFG0019), the Natural Science Foundation of Sichuan (No. 2023NSFSC0474), the Tianfu Yongxing Laboratory Organized Research Project Funding (No. 2023CXXM14), and the Digital Media Art Key Laboratory of Sichuan Province, Sichuan Conservatory of Music (No. 22DMAKL04).