Chinese Academy of Sciences Institutional Repositories Grid
Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry

Document Type: Journal Article

Authors: Zhao, Zixu 1,2,3,4; Liu, Chang 1,2,4; Yu, Wenyao 1,2,3,4; Shi, Jinglin 2; Zhang, Dalin 2
Journal: INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS
Publication Date: 2024-05-01
Volume: 21  Issue: 3  Pages: 20
Keywords: Adaptive pose fusion; LiDAR-visual-inertial odometry; environmental structure perception; pose estimation of unmanned vehicle; sensor fusion
ISSN: 1729-8814
DOI: 10.1177/17298806241248955
Abstract: Light Detection and Ranging (LiDAR)-visual-inertial odometry can provide accurate poses for localizing unmanned vehicles operating in unknown environments where the Global Positioning System (GPS) is unavailable. Because the quality of the poses estimated by different sensors fluctuates greatly across environments with different structures, existing pose fusion models cannot guarantee stable pose estimation in such environments, which poses a great challenge for the pose fusion of LiDAR-visual-inertial odometry. This article proposes a novel environmental structure perception-based adaptive pose fusion method, which optimizes the parameters of the pose fusion model of LiDAR-visual-inertial odometry online by analyzing the complexity of the environmental structure. First, a novel quantitative perception method for environmental structure is proposed: a visual bag-of-words vector and a point cloud feature histogram are constructed to compute quantitative indicators of the structural complexity of the visual images and LiDAR point clouds of the surroundings, which can then be used to predict and evaluate the quality of the poses produced by the LiDAR/visual measurement models. Second, based on the complexity of the environmental structure, two pose fusion strategies are proposed for the two mainstream pose fusion models (Kalman filter and factor graph optimization), which adaptively fuse the poses estimated by LiDAR and vision online. Two state-of-the-art LiDAR-visual-inertial odometry systems are selected to deploy the proposed method, and extensive experiments are carried out on both open-source and self-gathered data sets.
The experimental results show that the proposed method effectively perceives changes in environmental structure and performs adaptive pose fusion, improving the accuracy of pose estimation of LiDAR-visual-inertial odometry in environments with changing structures.
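The record contains no code, but the fusion strategy the abstract describes can be illustrated with a minimal sketch: a structural-complexity score derived from a feature histogram drives per-sensor measurement confidence, and the two pose estimates are fused with inverse-variance (scalar Kalman-style) weighting. All names here (`structure_complexity`, `fuse_poses`, the `base_var` parameter, and the normalized-entropy score) are hypothetical illustrations, not the paper's actual models, which are full Kalman-filter and factor-graph formulations.

```python
import math

def structure_complexity(histogram):
    """Normalized Shannon entropy of a feature histogram, in [0, 1].
    A flat histogram (rich, varied structure) scores near 1; a
    concentrated one (degenerate structure) scores near 0."""
    total = sum(histogram)
    if total == 0 or len(histogram) < 2:
        return 0.0
    probs = [h / total for h in histogram if h > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(len(histogram))

def fuse_poses(pose_lidar, pose_visual, c_lidar, c_visual, base_var=0.04):
    """Per-axis inverse-variance fusion of two pose estimates.
    Lower structural complexity implies a less reliable measurement,
    so that sensor is assigned a larger variance and a smaller weight."""
    var_l = base_var / max(c_lidar, 1e-6)
    var_v = base_var / max(c_visual, 1e-6)
    w_l = (1.0 / var_l) / (1.0 / var_l + 1.0 / var_v)
    return [w_l * pl + (1.0 - w_l) * pv
            for pl, pv in zip(pose_lidar, pose_visual)]
```

For equal complexity scores the fused pose is the midpoint of the two estimates; as the LiDAR score rises relative to the visual score, the fused pose is pulled toward the LiDAR estimate, which is the adaptive behavior the abstract attributes to both fusion strategies.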
Funding: National Key R&D Program of China [2022YFC3320800]; Zhejiang Provincial Key R&D Plan of China [2021C01040]
WOS Research Area: Robotics
Language: English
WOS Accession Number: WOS:001216075600001
Publisher: SAGE PUBLICATIONS INC
Source URL: http://119.78.100.204/handle/2XEOYT63/38984
Collection: Journal Papers in English, Institute of Computing Technology, Chinese Academy of Sciences
Corresponding Author: Zhao, Zixu
Affiliations:
1. Beijing Key Lab Mobile Comp & Pervas Device, Beijing, Peoples R China
2. Chinese Acad Sci, Inst Comp Technol, Wireless Commun Technol Res Ctr, Beijing 100190, Peoples R China
3. Univ Chinese Acad Sci, Beijing, Peoples R China
4. Chinese Acad Sci, Inst Comp Technol, State Key Lab Processors, Beijing, Peoples R China
Recommended Citation Formats
GB/T 7714: Zhao, Zixu, Liu, Chang, Yu, Wenyao, et al. Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry[J]. INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2024, 21(3): 20.
APA: Zhao, Zixu, Liu, Chang, Yu, Wenyao, Shi, Jinglin, & Zhang, Dalin. (2024). Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry. INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 21(3), 20.
MLA: Zhao, Zixu, et al. "Environmental-structure-perception-based adaptive pose fusion method for LiDAR-visual-inertial odometry". INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS 21.3 (2024): 20.

Deposit Method: OAI Harvesting

Source: Institute of Computing Technology


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.