Omnidirectional Depth Estimation With Hierarchical Deep Network for Multi-Fisheye Navigation Systems
Document type: Journal article
Authors | Su, Xiaojie (1); Liu, Shimin (1); Li, Rui (2) |
Journal | IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS |
Publication date | 2023-07-25 |
Pages | 12 |
Keywords | Feature extraction; Cameras; Estimation; Task analysis; Navigation; Costs; Semantics; Omnidirectional depth estimation; hierarchical deep network; multi-fisheye navigation system |
ISSN | 1524-9050 |
DOI | 10.1109/TITS.2023.3294642 |
Corresponding author | Li, Rui (rui.li@ia.ac.cn) |
Abstract | A multi-fisheye system has the advantages of sufficient overlap and the ability to capture a complete 360° scene, which is beneficial for the omnidirectional depth estimation task. However, due to the severe distortion of fisheye images, it is hard for such systems to extract and match features to predict an accurate depth. In this work, building on a multi-fisheye system, we present a novel end-to-end deep learning architecture for omnidirectional depth estimation: 1) to capture reliable features from the distorted fisheye images, an improved multi-scale feature extraction and aggregation module is proposed, which adaptively obtains global context information to represent the features; 2) to leverage more aligned features, especially those in the overlap between multi-fisheye images, we construct a fusion cost volume that combines similarity and semantic information, which enhances feature discriminability; and 3) to refine the omnidirectional depth map efficiently, a cascaded cost regularization architecture is proposed. Instead of several costly 3D convolutions, a 3D BSConv based on intra-kernel correlations is introduced to regularize the cost. The proposed method incrementally predicts the depth map from coarse to fine and significantly reduces the network's computational complexity. Experiments on several public indoor and outdoor synthetic datasets demonstrate that the proposed method outperforms several state-of-the-art methods in the combination of accuracy and speed, with fewer model parameters. |
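The record contains no code, so the following is a minimal sketch of the cost-regularization idea mentioned in point 3 of the abstract, assuming the "3D BSConv" follows the blueprint separable convolution pattern (a pointwise channel-mixing convolution followed by a depthwise 3D convolution over the cost volume). The class name `BSConv3d`, the layer arrangement, and the tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a 3D blueprint separable convolution (BSConv) block,
# one plausible reading of the "3D BSConv" used for cost regularization.
# Names and hyperparameters are illustrative, not the paper's implementation.
import torch
import torch.nn as nn


class BSConv3d(nn.Module):
    """Pointwise 3D conv (mixes channels) followed by a depthwise 3D conv
    (one k*k*k kernel per channel), replacing a full, costly 3D convolution."""

    def __init__(self, in_channels, out_channels, kernel_size=3, stride=1):
        super().__init__()
        padding = kernel_size // 2
        self.pointwise = nn.Conv3d(in_channels, out_channels, kernel_size=1, bias=False)
        self.depthwise = nn.Conv3d(
            out_channels, out_channels, kernel_size,
            stride=stride, padding=padding, groups=out_channels, bias=False,
        )
        self.bn = nn.BatchNorm3d(out_channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.depthwise(self.pointwise(x))))


if __name__ == "__main__":
    # Toy cost volume: (batch, feature channels, depth hypotheses, height, width).
    cost = torch.randn(1, 32, 48, 64, 128)
    block = BSConv3d(32, 32)
    print(block(cost).shape)  # torch.Size([1, 32, 48, 64, 128])
```

Compared with a full `nn.Conv3d(32, 32, 3)`, this factorization keeps the channel mixing in the cheap 1x1x1 layer and the spatial/depth filtering in the depthwise layer, which is consistent with the abstract's claim of reduced computational complexity; the exact savings in the paper's cascaded architecture are not given in this record.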
Funding projects | National Key Research and Development Program of China [2022YFE0107300]; National Natural Science Foundation of China [62003059]; Chongqing Human Resources and Social Bureau [cx2022064]; Graduate Research and Innovation Foundation of Chongqing, China [CYS22117] |
WOS research areas | Engineering; Transportation |
Language | English |
WOS accession number | WOS:001040607300001 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Funding agencies | National Key Research and Development Program of China; National Natural Science Foundation of China; Chongqing Human Resources and Social Bureau; Graduate Research and Innovation Foundation of Chongqing, China |
Source URL | http://ir.ia.ac.cn/handle/173211/53858 |
Collection | State Key Laboratory of Multimodal Artificial Intelligence Systems |
Author affiliations | 1. Chongqing Univ, Sch Automat, Chongqing 400044, Peoples R China; 2. Chinese Acad Sci, Inst Automat, State Key Lab Multimodal Artificial Intelligence S, Beijing 100190, Peoples R China |
Recommended citation (GB/T 7714) | Su, Xiaojie, Liu, Shimin, Li, Rui. Omnidirectional Depth Estimation With Hierarchical Deep Network for Multi-Fisheye Navigation Systems[J]. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2023: 12. |
APA | Su, Xiaojie, Liu, Shimin, & Li, Rui. (2023). Omnidirectional Depth Estimation With Hierarchical Deep Network for Multi-Fisheye Navigation Systems. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 12. |
MLA | Su, Xiaojie, et al. "Omnidirectional Depth Estimation With Hierarchical Deep Network for Multi-Fisheye Navigation Systems." IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS (2023): 12. |
Ingestion method: OAI harvesting
Source: Institute of Automation