Chinese Academy of Sciences Institutional Repositories Grid
Research of Docking Ring Target Measurement Method for Space On Orbit Maintenance

Document type: Degree thesis

Authors: Li Zhan; Jin Guankun
Degree: Master's
Defense date: 2018-05-17
Degree-granting institution: Shenyang Institute of Automation, Chinese Academy of Sciences
Place of conferral: Shenyang
Supervisor: Hao Yingming
Keywords: on-orbit maintenance; structured-light measurement; target recognition; pose estimation
Alternative title: Research of Docking Ring Target Measurement Method for Space On Orbit Maintenance
Degree program: Pattern Recognition and Intelligent Systems
Abstract (Chinese, translated): With advances in aerospace technology, the space manipulator has become an important tool for on-orbit maintenance and servicing. To enable a space manipulator to capture a target satellite automatically, the relative position and attitude between the manipulator end-effector and the capture target must be measured continuously and fed into the manipulator's closed-loop control. Targeting the automatic capture of a target spacecraft by a space manipulator, this thesis takes the satellite-rocket docking ring, a structure common to most spacecraft, as the capture object and studies structured-light-based relative pose measurement. A close-range, large-scale circular-ring pose measurement method based on multi-line structured light is proposed, which meets the requirements of automatic docking-ring capture by a space manipulator. First, driven by this application, a dual-line structured-light measurement scheme is adopted, and analysis of the capture object leads to the choice of target features for pose estimation: a combination of line features and point features. Structured-light-based relative pose measurement of the docking ring comprises two steps: target recognition with feature extraction, and relative pose estimation. For recognition and feature extraction, a method for identifying structured-light image targets and extracting their features against complex backgrounds is proposed: laser-stripe regions are extracted by region growing; target stripes are identified using shape constraints on individual stripes and geometric constraints between stripes; and stripe line equations are obtained by line fitting. For pose estimation, a single-set structured-light pose solution is first proposed: with the 3D coordinates of stripe feature points obtained from structured light, the ring-plane normal vector is computed by plane fitting, and the ring-center coordinates are solved via an intermediate coordinate system. Building on the single-set solution, the redundant information provided by two structured-light sets is exploited through optimization to improve measurement accuracy. Based on this method, an experimental structured-light docking-ring pose measurement system was built, vision measurement software was developed, and pose measurement experiments and accuracy tests were carried out in multiple experimental scenarios. The results show that the proposed method adapts well to environmental factors and achieves high measurement accuracy. In summary, the proposed structured-light docking-ring pose measurement method, aimed at automatic docking-ring capture by a space manipulator, achieves high-precision measurement of the relative pose between the manipulator and the docking ring even when the ring is only partially imaged by the measurement camera, and adapts well to environmental changes. The method has been applied in practical engineering.
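The plane-fitting step described in the abstract — recovering the ring-plane normal vector from the 3D stripe feature points — can be sketched with a least-squares (SVD) plane fit. This is a minimal illustration under common assumptions, not the thesis's actual implementation; the function name and data are hypothetical.

```python
import numpy as np

def fit_plane_normal(points):
    """Fit a plane to 3D stripe feature points by total least squares.

    points: (N, 3) array of 3D coordinates recovered from structured light.
    Returns (centroid, unit normal) of the best-fit plane; the normal is the
    singular vector of the centered points with the smallest singular value.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Row of V^T matching the smallest singular value = direction of
    # least variance = plane normal.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Synthetic check: four coplanar points in the z = 0 plane.
ring_pts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0],
                     [-1.0, 0.0, 0.0], [0.0, -1.0, 0.0]])
c, n = fit_plane_normal(ring_pts)
print(np.allclose(np.abs(n), [0.0, 0.0, 1.0]))  # normal is ±z
```

With noisy stripe points the same call returns the least-squares plane, which is why the abstract's pipeline can tolerate partial imaging of the ring: any sufficient arc of 3D points constrains the plane.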
Abstract (English): With the development of space technology, the space manipulator has become an important tool for space maintenance and on-orbit service. To capture a target satellite automatically, the space manipulator must continuously measure the relative position and attitude between its end-effector and the target as input to its closed-loop control. This thesis focuses on the automatic capture of a target spacecraft by a space manipulator and studies a structured-light-based relative pose measurement method, taking the docking ring, which commonly exists on spacecraft, as the capture object. It proposes a pose measurement method based on multi-line structured light that meets the requirements for automatic capture of a large-scale docking-ring target at close range. First, based on a dual-line structured-light measurement scheme, a combination of line and point features is selected for pose estimation through analysis of the capture object. A method for target recognition and feature extraction in structured-light images under complex backgrounds is then proposed for the selected target features: laser-stripe regions are extracted by region growing; the target stripes are identified using shape constraints on the individual stripes and geometric relationships between stripes; and lines are then fitted to the stripes. Next, a pose solution algorithm for a single set of structured light is proposed: from the three-dimensional coordinates of the stripe feature points, it computes the ring-plane normal vector by plane fitting and solves the ring-center coordinates using an intermediate coordinate system. On top of the single-set solution, a second structured-light set provides redundant information that is exploited through optimization to improve measurement accuracy.
Based on the above measurement method, an experimental structured-light system was established and visual measurement software was developed. Pose measurement experiments and accuracy tests were completed in several experimental scenes. The results show that the proposed method adapts well to the environment and achieves high accuracy. In summary, the thesis proposes a structured-light-based method oriented toward automatic capture of the docking ring by a space manipulator. Even when the docking ring is only partially imaged by the measurement camera, high-precision measurement of the relative pose between the manipulator and the docking ring can be achieved, with good adaptability to environmental change. The method has been applied in practical engineering projects.
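The "intermediate coordinate system" idea for solving the ring-center coordinates can be sketched as follows: rotate the 3D points into a frame whose z-axis is the fitted plane normal, fit a circle to the resulting 2D coordinates, and map the center back. This is a hedged sketch using an algebraic (Kåsa) circle fit, which the thesis may or may not use; all names are illustrative.

```python
import numpy as np

def ring_center(points, normal):
    """Estimate the center of a 3D circle from points on (part of) its arc.

    Builds an intermediate frame whose z-axis is the ring-plane normal,
    expresses the points in that frame, fits a 2D circle algebraically
    (Kåsa fit), and maps the fitted center back to the original frame.
    """
    pts = np.asarray(points, dtype=float)
    n = normal / np.linalg.norm(normal)
    # Any axis not parallel to n seeds the intermediate x-axis.
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    x = np.cross(a, n); x /= np.linalg.norm(x)
    y = np.cross(n, x)
    R = np.stack([x, y, n])            # world -> intermediate rotation
    origin = pts.mean(axis=0)
    uv = (pts - origin) @ R.T          # in-plane coordinates (u, v, ~0)
    u, v = uv[:, 0], uv[:, 1]
    # Kåsa fit: u^2 + v^2 = 2*cu*u + 2*cv*v + d, linear in (cu, cv, d).
    A = np.column_stack([2 * u, 2 * v, np.ones_like(u)])
    b = u ** 2 + v ** 2
    cu, cv, _ = np.linalg.lstsq(A, b, rcond=None)[0]
    return origin + cu * x + cv * y

# Partial arc only (a quarter circle), mirroring the partially-imaged ring.
theta = np.linspace(0.0, np.pi / 2, 10)
arc = np.column_stack([2.0 + 3.0 * np.cos(theta),
                       -1.0 + 3.0 * np.sin(theta),
                       np.full_like(theta, 5.0)])
center = ring_center(arc, np.array([0.0, 0.0, 1.0]))
print(np.allclose(center, [2.0, -1.0, 5.0]))  # True: center recovered from the arc
```

Note that the center is recovered from a quarter arc alone, which matches the abstract's claim that the pose can be measured when the docking ring is only partially imaged by the camera.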
Language: Chinese
Rights ranking: 1
Pages: 78
Source URL: [http://ir.sia.cn/handle/173321/21799]
Collection: Shenyang Institute of Automation, Digital Factory Department
Recommended citation (GB/T 7714):
Li Zhan, Jin Guankun. Research of Docking Ring Target Measurement Method for Space On Orbit Maintenance [D]. Shenyang: Shenyang Institute of Automation, Chinese Academy of Sciences, 2018.

Ingestion method: OAI harvesting

Source: Shenyang Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.