Obstacle Detection in the surrounding Environment of manipulators based on Point Cloud data
Document Type | Conference Paper
Authors | Hou ZL (侯赵磊) 1,2; Jiang Y (姜勇) 1,2
Publication Date | 2019
Conference Date | July 29 - August 2, 2019
Conference Venue | Suzhou, China
Pages | 1058-1062
Abstract | In production and everyday settings, robots that avoid obstacles autonomously during operation not only ensure safety but also improve space utilization and work efficiency. Real-time obstacle detection provides the basis for such collision avoidance. This paper studies self-identification of the manipulator and an obstacle detection method in a point cloud environment. First, the depth camera and the manipulator are calibrated using the least squares method. Second, to meet the environmental requirements of manipulator collision avoidance and human-robot cooperation, a Locally Convex Connected Patches (LCCP) segmentation method based on the kinematic model is proposed to realize self-identification of the robot. Finally, by setting control points on the manipulator model, the closest points and closest distances to the obstacles are obtained. The effectiveness of the method was verified on a Kinect and a UR robot.
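The abstract outlines three steps: least-squares calibration between the depth camera and the manipulator, LCCP-based self-identification, and closest-distance queries from manipulator control points to obstacles. As a rough illustration of the first and last steps only, the following minimal Python sketch fits a rigid transform from matched 3D point pairs with the standard SVD-based (Kabsch/Umeyama) least-squares solution and queries nearest obstacle points with a KD-tree. The function names, the synthetic data, and the NumPy/SciPy usage are assumptions for illustration; they are not the paper's implementation, which additionally relies on the kinematics-based LCCP segmentation.

```python
import numpy as np
from scipy.spatial import cKDTree

def fit_rigid_transform(cam_pts, robot_pts):
    """Least-squares rigid transform (R, t) mapping camera-frame points
    onto matched robot-frame points (SVD-based Kabsch/Umeyama solution)."""
    cam_c, rob_c = cam_pts.mean(axis=0), robot_pts.mean(axis=0)
    H = (cam_pts - cam_c).T @ (robot_pts - rob_c)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                        # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = rob_c - R @ cam_c
    return R, t

def closest_obstacle_points(control_pts, obstacle_cloud):
    """For each manipulator control point, return the nearest obstacle point
    and the corresponding Euclidean distance via a KD-tree query."""
    tree = cKDTree(obstacle_cloud)
    dists, idx = tree.query(control_pts)
    return obstacle_cloud[idx], dists

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic camera/robot correspondences related by a known rigid motion.
    cam = rng.normal(size=(20, 3))
    R_true, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(R_true) < 0:
        R_true[:, 0] *= -1                          # make it a proper rotation
    rob = cam @ R_true.T + np.array([0.1, -0.2, 0.5])
    R, t = fit_rigid_transform(cam, rob)
    # Synthetic obstacle cloud and control points on the arm.
    obstacles = rng.uniform(-1, 1, size=(500, 3))
    ctrl = rng.uniform(-1, 1, size=(6, 3))
    nearest, d = closest_obstacle_points(ctrl, obstacles)
    print("min distance to obstacles:", d.min())
```

In a real setup the correspondences would come from a calibration target observed by the Kinect at known robot poses, and the obstacle cloud would be the segmented scene with the manipulator's own points removed.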
Rights Ranking | 1
Proceedings | Proceedings of the 9th IEEE International Conference on CYBER Technology in Automation, Control, and Intelligent Systems
Proceedings Publisher | IEEE
Place of Publication | New York
Language | English
ISBN | 978-1-7281-0769-1
WOS ID | WOS:000569550300183
Source URL | http://ir.sia.cn/handle/173321/26837
Division | 工艺装备与智能机器人研究室
Corresponding Author | Hou ZL (侯赵磊)
Affiliations | 1. Institutes for Robotics and Intelligent Manufacturing, Chinese Academy of Sciences, Shenyang 110016, China; 2. State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China
Recommended Citation (GB/T 7714) | Hou ZL, Jiang Y, Liu GD, et al. Obstacle Detection in the surrounding Environment of manipulators based on Point Cloud data[C]. In: Proceedings of the 9th IEEE International Conference on CYBER Technology in Automation, Control, and Intelligent Systems. Suzhou, China. July 29 - August 2, 2019: 1058-1062.
Deposit Method | OAI harvesting
Source | Shenyang Institute of Automation