Chinese Academy of Sciences Institutional Repositories Grid
Learning Realistic and Reasonable Grasps for Anthropomorphic Hand in Cluttered Scenes

Document Type: Conference Paper

Authors: Duan, Haonan; Li, Yiming; Li, Daheng; Wei, Wei; Huang, Yayu; Wang, Peng
Publication Date: 2024-05-17
Conference Date: 2024-05-13
Conference Venue: Yokohama, Japan
Keywords: Robotic grasping; Anthropomorphic hand; Affordance
Abstract:

Grasping is one of the most fundamental skills humans use to interact with objects. However, it remains a challenging problem for anthropomorphic hands due to the lack of object affordance understanding and the high dimensionality of grasp planning. In this work, we propose an anthropomorphic hand grasping framework that learns realistic and reasonable grasps in cluttered scenes by decomposing the problem into three stages: 1) graspable point segmentation; 2) hand grasp generation; and 3) grasp optimization. Specifically, our method generates high-quality hand grasps efficiently without complete object models by learning graspable points and their associated grasp configurations from the observed point cloud in parallel, then optimizing the predicted grasps based on hand-object contacts. Simulation experiments show that our model effectively generates physically plausible grasps for the anthropomorphic hand, with a success rate of over 70%. Real-world experiments demonstrate that the model trained in simulation performs satisfactorily on unseen objects in real-world scenarios.
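To make the three-stage pipeline concrete, the sketch below mirrors the structure described in the abstract in plain Python. Everything specific here is an assumption: the function names, the heuristic segmentation score, the 6 + 22-dimensional grasp configuration, and the centroid-based contact proxy are illustrative placeholders, not the authors' implementation.

# Hypothetical sketch of the three-stage pipeline from the abstract.
# All names, shapes, and heuristics are illustrative assumptions.
import numpy as np

HAND_DOF = 22  # assumed joint count for an anthropomorphic hand

def segment_graspable_points(points: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Stage 1: score each observed point and keep the graspable ones.
    A stand-in heuristic (normalized height) replaces the learned
    segmentation network."""
    scores = (points[:, 2] - points[:, 2].min()) / (np.ptp(points[:, 2]) + 1e-8)
    return points[scores > threshold]

def generate_grasps(graspable: np.ndarray) -> np.ndarray:
    """Stage 2: predict one grasp configuration per graspable point in
    parallel. A real model would regress wrist pose plus joint angles;
    this stub returns zeros of the assumed size (3 translation +
    3 rotation + HAND_DOF joint angles)."""
    return np.zeros((graspable.shape[0], 6 + HAND_DOF))

def optimize_grasps(grasps: np.ndarray, points: np.ndarray,
                    steps: int = 10, lr: float = 0.01) -> np.ndarray:
    """Stage 3: refine predicted grasps with a contact-based objective.
    As a placeholder, nudge each grasp translation toward the scene
    centroid, a crude proxy for increasing hand-object contact."""
    centroid = points.mean(axis=0)
    refined = grasps.copy()
    for _ in range(steps):
        refined[:, :3] += lr * (centroid - refined[:, :3])
    return refined

if __name__ == "__main__":
    scene = np.random.rand(2048, 3)               # stand-in observed point cloud
    candidates = segment_graspable_points(scene)  # stage 1
    grasps = generate_grasps(candidates)          # stage 2
    grasps = optimize_grasps(grasps, scene)       # stage 3
    print(f"{len(grasps)} refined grasp configurations")

In the framework described by the abstract, the first two stages would be learned models operating on the observed point cloud, and the third stage would optimize the predicted grasps using hand-object contact information rather than the centroid proxy used here.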

Language: English
Source URL: http://ir.ia.ac.cn/handle/173211/56667
Collection: Intelligent Robot Systems Research
Corresponding Author: Wang, Peng
Author Affiliations:
1. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
2. Centre for Artificial Intelligence and Robotics, Hong Kong Institute of Science and Innovation, Chinese Academy of Sciences, Hong Kong, China
3. Institute of Automation, Chinese Academy of Sciences, Beijing, China
Recommended Citation (GB/T 7714):
Duan, Haonan, Li, Yiming, Li, Daheng, et al. Learning Realistic and Reasonable Grasps for Anthropomorphic Hand in Cluttered Scenes[C]. Yokohama, Japan. 2024-05-13.

Deposit Method: OAI Harvesting

Source: Institute of Automation

