Chinese Academy of Sciences Institutional Repositories Grid
Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis

Document Type: Journal Article

Authors: Yuan, Yu-Jie (1,4); Lai, Yu-Kun (3); Huang, Yi-Hua (1,4); Kobbelt, Leif (2); Gao, Lin (1,4)
Journal: IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Publication Date: 2023-07-01
Volume: 45; Issue: 7; Pages: 8713-8728
ISSN: 0162-8828
Keywords: Novel view synthesis; neural rendering; neural radiance fields
DOI: 10.1109/TPAMI.2022.3232502
Abstract: The recently proposed neural radiance fields (NeRF) use a continuous function formulated as a multi-layer perceptron (MLP) to model the appearance and geometry of a 3D scene. This enables realistic synthesis of novel views, even for scenes with view-dependent appearance. Many follow-up works have since extended NeRFs in different ways. However, a fundamental restriction of the method remains: it requires a large number of images captured from densely placed viewpoints for high-quality synthesis, and the quality of the results quickly degrades when the number of captured views is insufficient. To address this problem, we propose a novel NeRF-based framework capable of high-quality view synthesis using only a sparse set of RGB-D images, which can be easily captured using cameras and LiDAR sensors on current consumer devices. First, a geometric proxy of the scene is reconstructed from the captured RGB-D images. Renderings of the reconstructed scene along with precise camera parameters can then be used to pre-train a network. Finally, the network is fine-tuned with a small number of real captured images. We further introduce a patch discriminator to supervise the network under novel views during fine-tuning, as well as a 3D color prior to improve synthesis quality. We demonstrate that our method can generate arbitrary novel views of a 3D scene from as few as 6 RGB-D images. Extensive experiments show the improvements of our method compared with existing NeRF-based methods, including approaches that also aim to reduce the number of input images.
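For background on the formulation the abstract refers to: NeRF-style methods query an MLP for a density and color at sample points along each camera ray, and composite them with the standard volume-rendering equation. The sketch below illustrates only that generic compositing step and the positional encoding of inputs; it is a minimal illustration in plain Python, not the authors' implementation, and the function names are hypothetical.

```python
import math

def positional_encoding(p, num_freqs=4):
    """Standard NeRF-style positional encoding of a scalar coordinate p:
    gamma(p) = (sin(2^k * pi * p), cos(2^k * pi * p)) for k = 0..num_freqs-1."""
    feats = []
    for k in range(num_freqs):
        f = (2.0 ** k) * math.pi * p
        feats.extend([math.sin(f), math.cos(f)])
    return feats

def composite(sigmas, colors, deltas):
    """Volume-rendering compositing along one ray:
    C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    where T_i = exp(-sum_{j<i} sigma_j * delta_j) is the transmittance,
    sigma_i the predicted density, c_i the predicted (scalar) color,
    and delta_i the distance between adjacent samples."""
    color = 0.0
    transmittance = 1.0
    for sigma, c, delta in zip(sigmas, colors, deltas):
        alpha = 1.0 - math.exp(-sigma * delta)  # opacity of this segment
        color += transmittance * alpha * c
        transmittance *= 1.0 - alpha            # light remaining past this segment
    return color
```

In a full pipeline, `sigmas` and `colors` come from the MLP evaluated at encoded sample positions, and the rendered color is compared against ground-truth pixels to train the network.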
Funding: National Natural Science Foundation of China [62061136007]; Beijing Municipal Natural Science Foundation for Distinguished Young Scholars [JQ21013]; Youth Innovation Promotion Association CAS
WOS Research Areas: Computer Science; Engineering
Language: English
Publisher: IEEE COMPUTER SOC
WOS Accession Number: WOS:001004665900051
Source URL: http://119.78.100.204/handle/2XEOYT63/21255
Collection: Institute of Computing Technology, Chinese Academy of Sciences, Journal Articles (English)
Corresponding Author: Gao, Lin
Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, Beijing Key Lab Mobile Comp & Pervas Device, Beijing 100045, Peoples R China
2. Rhein Westfal TH Aachen, Inst Comp Graph & Multimedia, D-52062 Aachen, Germany
3. Cardiff Univ, Sch Comp Sci & Informat, Cardiff CF10 3AT, Wales
4. Univ Chinese Acad Sci, Beijing 101408, Peoples R China
Recommended Citation Formats
GB/T 7714: Yuan, Yu-Jie, Lai, Yu-Kun, Huang, Yi-Hua, et al. Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis[J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45(7): 8713-8728.
APA: Yuan, Yu-Jie, Lai, Yu-Kun, Huang, Yi-Hua, Kobbelt, Leif, & Gao, Lin. (2023). Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 45(7), 8713-8728.
MLA: Yuan, Yu-Jie, et al. "Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis". IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 45.7 (2023): 8713-8728.

Deposit Method: OAI harvesting

Source: Institute of Computing Technology


Unless otherwise specified, all content in this system is protected by copyright and all rights are reserved.