Chinese Academy of Sciences Institutional Repositories Grid
BFRFormer: Transformer-based generator for Real-World Blind Face Restoration

Document Type: Conference Paper

Authors: Guojing Ge1,5; Qi Song6; Guibo Zhu1,3,4,5; Yuting Zhang2; Jinglu Chen5; Miao Xin5; Ming Tang5; Jinqiao Wang1,3,5
Publication Date: 2024-04-14
Conference Dates: April 14 to April 19, 2024
Conference Location: Seoul, Korea
Abstract
Blind face restoration is a challenging task due to unknown and complex degradation. Although face prior-based methods and reference-based methods have recently demonstrated high-quality results, the restored images tend to be over-smoothed and to lose identity-preserved details when the degradation is severe. We observe that this is attributable to short-range dependencies, an intrinsic limitation of convolutional neural networks. To model long-range dependencies, we propose a Transformer-based blind face restoration method, named BFRFormer, which reconstructs images with more identity-preserved details in an end-to-end manner. In BFRFormer, a wavelet discriminator and an aggregated attention module are developed to remove blocking artifacts, and spectral normalization and balanced consistency regularization are adaptively applied to address training instability and over-fitting, respectively. Extensive experiments show that our method outperforms state-of-the-art methods on a synthetic dataset and four real-world datasets. The source code, the Casia-Test dataset, and pre-trained models are released at https://github.com/s8Znk/BFRFormer.
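
The abstract names spectral normalization and balanced consistency regularization (bCR) as the stabilizers applied during adversarial training. The PyTorch sketch below is illustrative only, not the released BFRFormer implementation: the discriminator architecture, module names, augmentation choice, and loss weights are all assumptions. It shows how these two techniques are commonly combined in a GAN discriminator.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn.utils import spectral_norm

class SNDiscriminator(nn.Module):
    """Toy convolutional discriminator with spectral normalization on every conv layer."""
    def __init__(self, in_ch=3, base=64):
        super().__init__()
        self.net = nn.Sequential(
            spectral_norm(nn.Conv2d(in_ch, base, 4, stride=2, padding=1)),
            nn.LeakyReLU(0.2, inplace=True),
            spectral_norm(nn.Conv2d(base, base * 2, 4, stride=2, padding=1)),
            nn.LeakyReLU(0.2, inplace=True),
            spectral_norm(nn.Conv2d(base * 2, 1, 4, stride=1, padding=0)),
        )

    def forward(self, x):
        # One realness score per image.
        return self.net(x).flatten(1).mean(dim=1)

def bcr_loss(disc, real, fake, augment, lambda_real=10.0, lambda_fake=10.0):
    # Balanced consistency regularization: penalize the discriminator when an
    # image and its augmented copy receive different scores, for both real and
    # generated images. The lambda weights here are illustrative, not from the paper.
    loss_real = F.mse_loss(disc(augment(real)), disc(real))
    loss_fake = F.mse_loss(disc(augment(fake)), disc(fake))
    return lambda_real * loss_real + lambda_fake * loss_fake

if __name__ == "__main__":
    disc = SNDiscriminator()
    real = torch.randn(2, 3, 64, 64)   # stand-ins for ground-truth faces
    fake = torch.randn(2, 3, 64, 64)   # stand-ins for restored faces
    flip = lambda x: torch.flip(x, dims=[-1])  # horizontal flip as a simple augmentation
    print(bcr_loss(disc, real, fake, flip).item())

In practice this regularizer is added to the usual adversarial loss of the discriminator; spectral normalization constrains the Lipschitz constant of each layer, while bCR discourages over-fitting to particular samples.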
Source URL: http://ir.ia.ac.cn/handle/173211/57281
Collection: Zidong Taichu Large Model Research Center
Author Affiliations:
1.Wuhan AI Research
2.China Telecom Corporation Ltd
3.University of Chinese Academy of Sciences
4.Shanghai Artificial Intelligence Laboratory
5.Institute of Automation, Chinese Academy of Sciences
6.Hong Kong Baptist University
Recommended Citation
GB/T 7714
Guojing Ge, Qi Song, Guibo Zhu, et al. BFRFormer: Transformer-based generator for Real-World Blind Face Restoration[C]. In: . Seoul, Korea. April 14 to April 19, 2024.

Deposit Method: OAI Harvesting

Source: Institute of Automation

