Chinese Academy of Sciences Institutional Repositories Grid
SOGAN: 3D-Aware Shadow and Occlusion Robust GAN for Makeup Transfer

Document Type: Conference Paper

Authors: Yueming Lyu; Jing Dong; Bo Peng; Wei Wang; Tieniu Tan
Publication Year: 2021
Conference Date: 2021-10
Conference Venue: Chengdu, China
Abstract

In recent years, virtual makeup applications have become increasingly popular. However, it remains challenging to build a makeup transfer method that is robust in real-world environments. Current makeup transfer methods mostly work well on clean, well-conditioned makeup images, but their results degrade when the reference makeup exhibits shadow or occlusion. To alleviate this, we propose a novel makeup transfer method, called 3D-Aware Shadow and Occlusion Robust GAN (SOGAN). Given the source and the reference faces, we first fit a 3D face model and then disentangle the faces into shape and texture. In the texture branch, we map the texture to the UV space and design a UV texture generator to transfer the makeup. Since human faces are symmetrical in the UV space, we can conveniently remove the undesired shadow and occlusion from the reference image by carefully designing a Flip Attention Module (FAM). After obtaining cleaner makeup features from the reference image, a Makeup Transfer Module (MTM) is introduced to perform accurate makeup transfer. Qualitative and quantitative experiments demonstrate that our SOGAN not only achieves superior results under shadow and occlusion but also performs well under large pose and expression variations.
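The core intuition behind the Flip Attention Module, exploiting the left-right symmetry of the UV texture so that occluded or shadowed pixels can borrow from their clean mirrored counterparts, can be illustrated with a minimal sketch. This is not the paper's network: the function name, the hard binary mask, and the hard 0/1 attention weights below are illustrative stand-ins for the learned soft attention described in the abstract.

```python
import numpy as np

def flip_attention_clean(uv_texture, occlusion_mask):
    """Illustrative sketch of the FAM idea: replace occluded UV pixels
    with their horizontally mirrored counterparts, exploiting facial
    symmetry in UV space.

    uv_texture:     (H, W, C) float array, the unwrapped face texture.
    occlusion_mask: (H, W) bool array, True where shadowed/occluded.
    """
    flipped = uv_texture[:, ::-1, :]        # mirror the texture left-right
    flipped_mask = occlusion_mask[:, ::-1]  # mirror the occlusion mask too
    # Hard attention stand-in: keep the original pixel unless it is
    # occluded while its mirror is clean; in that case borrow the mirror.
    borrow = occlusion_mask & ~flipped_mask
    attn = np.where(borrow[..., None], 0.0, 1.0)
    return attn * uv_texture + (1.0 - attn) * flipped

# Toy example: a 2x4 single-channel "texture" with one occluded pixel.
tex = np.arange(8, dtype=float).reshape(2, 4, 1)
mask = np.zeros((2, 4), dtype=bool)
mask[0, 0] = True  # pixel (0, 0) is occluded; its mirror (0, 3) is clean
cleaned = flip_attention_clean(tex, mask)
# cleaned[0, 0] now holds the mirrored value from tex[0, 3]
```

In the actual SOGAN pipeline the attention weights would be predicted by a network rather than thresholded, so clean and mirrored features are blended softly instead of being swapped outright.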

Source URL: http://ir.ia.ac.cn/handle/173211/56614
Collection: Institute of Automation, Center for Research on Intelligent Perception and Computing
Corresponding Author: Jing Dong
Affiliations:
1. University of Chinese Academy of Sciences
2. Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Yueming Lyu, Jing Dong, Bo Peng, et al. SOGAN: 3D-Aware Shadow and Occlusion Robust GAN for Makeup Transfer[C]. Chengdu, China, 2021-10.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise specified, all content in this system is protected by copyright, with all rights reserved.