Chinese Academy of Sciences Institutional Repositories Grid
AttGAN: Facial Attribute Editing by Only Changing What You Want

Document type: Journal article

Authors: He, Zhenliang (2,3); Zuo, Wangmeng (1); Kan, Meina (2,3); Shan, Shiguang (2,3,4,5); Chen, Xilin (2,3)
Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING
Publication date: 2019-11-01
Volume: 28; Issue: 11; Pages: 5464-5478
Keywords: Facial attribute editing; attribute style manipulation; adversarial learning
ISSN: 1057-7149
DOI: 10.1109/TIP.2019.2916751
Abstract: Facial attribute editing aims to manipulate single or multiple attributes on a given face image, i.e., to generate a new face image with desired attributes while preserving other details. Recently, the generative adversarial net (GAN) and the encoder-decoder architecture are usually incorporated to handle this task with promising results. Based on the encoder-decoder architecture, facial attribute editing is achieved by decoding the latent representation of a given face conditioned on the desired attributes. Some existing methods attempt to establish an attribute-independent latent representation for further attribute editing. However, such an attribute-independent constraint on the latent representation is excessive because it restricts the capacity of the latent representation and may result in information loss, leading to over-smooth or distorted generation. Instead of imposing constraints on the latent representation, in this work, we propose to apply an attribute classification constraint to the generated image to just guarantee the correct change of desired attributes, i.e., to change what you want. Meanwhile, reconstruction learning is introduced to preserve attribute-excluding details, in other words, to only change what you want. In addition, adversarial learning is employed for visually realistic editing. These three components cooperate with each other, forming an effective framework for high-quality facial attribute editing, referred to as AttGAN. Furthermore, the proposed method is extended to attribute style manipulation in an unsupervised manner. Experiments on two wild datasets, CelebA and LFW, show that the proposed method outperforms the state-of-the-art on realistic attribute editing with other facial details well preserved.
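The abstract describes three cooperating components: an attribute classification constraint on the edited image, reconstruction learning to preserve attribute-excluding details, and adversarial learning for realism. The following is a minimal, hedged sketch of how such a combined generator objective might be composed; the function names and the loss weights (`lambda_rec`, `lambda_cls`) are illustrative assumptions, not the paper's exact formulation or hyperparameters.

```python
import numpy as np

def l1_reconstruction(x, x_rec):
    """Reconstruction term: encourages the decoder to preserve
    attribute-excluding details of the input face."""
    return float(np.mean(np.abs(x - x_rec)))

def binary_cross_entropy(pred, target, eps=1e-7):
    """Attribute classification term on the edited image: guarantees
    the desired attributes actually change (per-attribute BCE)."""
    pred = np.clip(pred, eps, 1.0 - eps)
    return float(-np.mean(target * np.log(pred)
                          + (1.0 - target) * np.log(1.0 - pred)))

def generator_loss(x, x_rec, attr_pred, attr_target, adv_term,
                   lambda_rec=100.0, lambda_cls=10.0):
    """Weighted sum of the three components; adv_term stands in for
    the adversarial loss supplied by a discriminator (assumed here)."""
    return (lambda_rec * l1_reconstruction(x, x_rec)
            + lambda_cls * binary_cross_entropy(attr_pred, attr_target)
            + adv_term)
```

In this sketch the reconstruction term dominates by weight, which matches the abstract's emphasis on preserving details while the classification term only enforces the requested attribute changes.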
Funding: National Key R&D Program of China [2017YFA0700800]; Natural Science Foundation of China [61671182, 61772496, 61732004]
WOS research areas: Computer Science; Engineering
Language: English
WOS accession number: WOS:000482600600017
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Source URL: http://119.78.100.204/handle/2XEOYT63/4731
Collection: Institute of Computing Technology, CAS, journal papers (English)
Corresponding author: Shan, Shiguang
Author affiliations:
1. Harbin Inst Technol, Sch Comp Sci & Technol, Harbin 150001, Heilongjiang, Peoples R China
2. Chinese Acad Sci, Inst Comp Technol, Key Lab Intelligent Informat Proc, Beijing 100190, Peoples R China
3. Univ Chinese Acad Sci, Beijing 100049, Peoples R China
4. CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China
5. Peng Cheng Lab, Shenzhen 518055, Peoples R China
Recommended citation formats
GB/T 7714
He, Zhenliang, Zuo, Wangmeng, Kan, Meina, et al. AttGAN: Facial Attribute Editing by Only Changing What You Want[J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2019, 28(11): 5464-5478.
APA He, Zhenliang, Zuo, Wangmeng, Kan, Meina, Shan, Shiguang, & Chen, Xilin. (2019). AttGAN: Facial Attribute Editing by Only Changing What You Want. IEEE TRANSACTIONS ON IMAGE PROCESSING, 28(11), 5464-5478.
MLA He, Zhenliang, et al. "AttGAN: Facial Attribute Editing by Only Changing What You Want." IEEE TRANSACTIONS ON IMAGE PROCESSING 28.11 (2019): 5464-5478.

Deposit method: OAI harvesting

Source: Institute of Computing Technology


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.