Fast End-to-End Trainable Guided Filter
Document Type: Conference Paper
Author | Wu, Huikai (1,3) |
Publication Date | 2018-06 |
Conference Date | 18-23 June 2018 |
Conference Venue | Salt Lake City, UT, USA |
Pages | 1838-1847 |
Abstract | Image processing and pixel-wise dense prediction have been advanced by harnessing the capabilities of deep learning. One central issue of deep learning is the limited capacity to handle joint upsampling. We present a deep learning building block for joint upsampling, namely the guided filtering layer. This layer aims at efficiently generating the high-resolution output given the corresponding low-resolution one and a high-resolution guidance map. The proposed layer is composed of a guided filter, which is reformulated as a fully differentiable block. To this end, we show that a guided filter can be expressed as a group of spatially varying linear transformation matrices. This layer can be integrated with convolutional neural networks (CNNs) and jointly optimized through end-to-end training. To further take advantage of end-to-end training, we plug in a trainable transformation function that generates task-specific guidance maps. By integrating CNNs and the proposed layer, we form deep guided filtering networks. The proposed networks are evaluated on five advanced image processing tasks. Experiments on the MIT-Adobe FiveK Dataset demonstrate that the proposed approach runs 10-100× faster and achieves state-of-the-art performance. We also show that the proposed guided filtering layer helps to improve the performance of multiple pixel-wise dense prediction tasks. The code is available at https://github.com/wuhuikai/DeepGuidedFilter. |
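The abstract's key observation is that a guided filter computes, per pixel, a local linear model q = a·I + b, i.e. a group of spatially varying linear transformations of the guidance map. The sketch below (not from the paper's released code) shows the classic guided filter of He et al. that the layer reformulates, implemented in NumPy with an O(1)-per-pixel box filter; function names and the choice of edge handling are illustrative assumptions.

```python
import numpy as np

def box_filter(x, r):
    # Mean over a (2r+1)x(2r+1) window, clipped at image borders,
    # computed from a 2-D cumulative sum so cost is O(1) per pixel.
    h, w = x.shape
    c = np.cumsum(np.cumsum(np.pad(x, ((1, 0), (1, 0))), axis=0), axis=1)
    y0 = np.clip(np.arange(h) - r, 0, h)
    y1 = np.clip(np.arange(h) + r + 1, 0, h)
    x0 = np.clip(np.arange(w) - r, 0, w)
    x1 = np.clip(np.arange(w) + r + 1, 0, w)
    s = (c[np.ix_(y1, x1)] - c[np.ix_(y0, x1)]
         - c[np.ix_(y1, x0)] + c[np.ix_(y0, x0)])
    area = (y1 - y0)[:, None] * (x1 - x0)[None, :]
    return s / area

def guided_filter(I, p, r=4, eps=1e-2):
    # Classic guided filter: fit q = a*I + b in each local window,
    # then average the coefficients. Every step is a mean filter or
    # an elementwise op, hence fully differentiable.
    mean_I = box_filter(I, r)
    mean_p = box_filter(p, r)
    cov_Ip = box_filter(I * p, r) - mean_I * mean_p
    var_I = box_filter(I * I, r) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)      # spatially varying linear coefficients
    b = mean_p - a * mean_I
    return box_filter(a, r) * I + box_filter(b, r)
```

In the paper's setting, a and b are computed at low resolution and upsampled before the final linear step, and the guidance map I itself comes from a trainable transformation, which is what makes the block end-to-end trainable.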
Source URL | http://ir.ia.ac.cn/handle/173211/38528 |
Collection | Intelligent Systems and Engineering |
Corresponding Author | Huang, Kaiqi |
Affiliations | 1. University of Chinese Academy of Sciences 2. eBay Research 3. Institute of Automation, Chinese Academy of Sciences |
Recommended Citation (GB/T 7714) | Wu, Huikai, Zheng, Shuai, Zhang, Junge, et al. Fast End-to-End Trainable Guided Filter[C]. In: Salt Lake City, UT, USA, 18-23 June 2018. |
Deposit Method: OAI harvesting
Source: Institute of Automation
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.