Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis
Document Type | Journal Article
Authors | Yuan, Yu-Jie1,4; Lai, Yu-Kun3; Huang, Yi-Hua1,4; Kobbelt, Leif2; Gao, Lin1,4
Journal | IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
Publication Date | 2023-07-01
Volume | 45
Issue | 7
Pages | 8713-8728
Keywords | Novel view synthesis; neural rendering; neural radiance fields
ISSN | 0162-8828
DOI | 10.1109/TPAMI.2022.3232502
Abstract | The recently proposed neural radiance fields (NeRF) use a continuous function formulated as a multi-layer perceptron (MLP) to model the appearance and geometry of a 3D scene. This enables realistic synthesis of novel views, even for scenes with view-dependent appearance. Many follow-up works have since extended NeRFs in different ways. However, a fundamental restriction of the method remains: it requires a large number of images captured from densely placed viewpoints for high-quality synthesis, and the quality of the results quickly degrades when the number of captured views is insufficient. To address this problem, we propose a novel NeRF-based framework capable of high-quality view synthesis using only a sparse set of RGB-D images, which can be easily captured using cameras and LiDAR sensors on current consumer devices. First, a geometric proxy of the scene is reconstructed from the captured RGB-D images. Renderings of the reconstructed scene along with precise camera parameters can then be used to pre-train a network. Finally, the network is fine-tuned with a small number of real captured images. We further introduce a patch discriminator to supervise the network under novel views during fine-tuning, as well as a 3D color prior to improve synthesis quality. We demonstrate that our method can generate arbitrary novel views of a 3D scene from as few as 6 RGB-D images. Extensive experiments show the improvements of our method compared with the existing NeRF-based methods, including approaches that also aim to reduce the number of input images.
Funding | National Natural Science Foundation of China [62061136007]; Beijing Municipal Natural Science Foundation for Distinguished Young Scholars [JQ21013]; Youth Innovation Promotion Association CAS
WOS Research Areas | Computer Science; Engineering
Language | English
WOS Accession Number | WOS:001004665900051
Publisher | IEEE COMPUTER SOC
Source URL | [http://119.78.100.204/handle/2XEOYT63/21255]
Collection | Institute of Computing Technology, Chinese Academy of Sciences: Journal Papers (English)
Corresponding Author | Gao, Lin
Affiliations | 1. Chinese Acad Sci, Inst Comp Technol, Beijing Key Lab Mobile Comp & Pervas Device, Beijing 100045, Peoples R China
2. Rhein Westfal TH Aachen, Inst Comp Graph & Multimedia, D-52062 Aachen, Germany
3. Cardiff Univ, Sch Comp Sci & Informat, Cardiff CF10 3AT, Wales
4. Univ Chinese Acad Sci, Beijing 101408, Peoples R China
Recommended Citation (GB/T 7714) | Yuan, Yu-Jie, Lai, Yu-Kun, Huang, Yi-Hua, et al. Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis[J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45(7): 8713-8728.
APA | Yuan, Yu-Jie, Lai, Yu-Kun, Huang, Yi-Hua, Kobbelt, Leif, & Gao, Lin. (2023). Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 45(7), 8713-8728.
MLA | Yuan, Yu-Jie, et al. "Neural Radiance Fields From Sparse RGB-D Images for High-Quality View Synthesis." IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE 45.7 (2023): 8713-8728.
Ingestion Method | OAI harvesting
Source | Institute of Computing Technology