华中科技大学学报(自然科学版) 2024, Vol. 52, Issue (11): 117-124, 8. DOI: 10.13245/j.hust.240172
Facial attribute editing based on global perception
Abstract
Aiming at the issue of correctly editing target facial attributes while preserving the details of other attributes, an end-to-end facial attribute editing model, global perception generative adversarial network (GP-GAN), was proposed. A global perception module was designed to replace the traditional convolutional encoder, so that the encoder had both global and local perception and could better edit global as well as local attributes. Meanwhile, to address the problem of attribute disentanglement, an attribute transformer composed of a multi-scale attribute feature fusion module and a multilayer perceptron transformer was proposed. The multi-scale attribute feature fusion module could generate rich feature vectors in the latent space, providing the conditions for attribute disentanglement. The multilayer perceptron transformer divided the facial attribute feature coding into two parts, attribute-related and attribute-unrelated, and applied orthogonal constraints to ensure attribute disentanglement. Experimental results show that the proposed model can better edit facial attributes while retaining the details of other attributes, so that the generated images are more natural and realistic, with higher quality.
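The orthogonal constraint described in the abstract can be illustrated with a minimal sketch. The function below penalizes correlation between an attribute-related code and an attribute-unrelated code via their squared cosine similarity, which is zero exactly when the two codes are orthogonal. The names `z_rel` and `z_unrel`, and the specific loss form, are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def orthogonality_loss(z_rel, z_unrel):
    """Squared cosine similarity between the attribute-related code z_rel
    and the attribute-unrelated code z_unrel (hypothetical names).
    The loss is 0 exactly when the two codes are orthogonal and 1 when
    they are parallel, so minimizing it pushes the codes apart."""
    dot = float(np.dot(z_rel, z_unrel))
    denom = float(np.linalg.norm(z_rel) * np.linalg.norm(z_unrel))
    return (dot / denom) ** 2

# A pair of orthogonal codes incurs zero loss:
z_rel = np.array([1.0, 0.0, 2.0, 0.0])
z_unrel = np.array([0.0, 3.0, 0.0, 1.0])
print(orthogonality_loss(z_rel, z_unrel))  # → 0.0
```

In training, a term like this would be added to the generator objective so that editing the related code leaves the unrelated code (and hence the other attributes' details) unchanged.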
Keywords
facial attribute editing / multilayer perceptron / global perception / local perception / multi-scale feature extraction
Classification
Information technology and security science
Citation
顾广华, 刘畅, 孙文星, 窦轶阳. Facial attribute editing based on global perception [J]. 华中科技大学学报(自然科学版), 2024, 52(11): 117-124, 8.
Funding
National Natural Science Foundation of China (62072394)
Natural Science Foundation of Hebei Province (F2024203049)
Hebei Provincial Key Laboratory Project (202250701010046)