Chinese Academy of Sciences Institutional Repositories Grid
Active Pushing for Better Grasping in Dense Clutter with Deep Reinforcement Learning

Document Type: Conference Paper

Authors: Lu, Ning1,2; Lu, Tao2; Cai, Yinghao2; Wang, Shuo2
Publication Date: 2021
Conference Date: 6-8 Nov. 2020
Conference Venue: Shanghai, China
Abstract (English)

Robotic grasping in unstructured dense clutter remains a challenging task and has long been a key research direction in robotics. In this paper, we propose a novel robotic grasping system that exploits the synergies between pushing and grasping actions to automatically grasp objects in dense clutter. Our method uses fully convolutional action-value functions (FCAVF) to map visual observations to two action-value tables in a Q-learning framework. These two value tables infer the utility of pushing and grasping actions, and the highest value, together with its corresponding location and orientation, indicates the best pose at which the end effector should act. To improve grasping, we introduce an active pushing mechanism based on a new metric, called Dispersion Degree, which describes how spread out the objects in the environment are. We then design a coordination mechanism that applies the synergies of the different actions based on the action values and the dispersion degree of the objects, making the grasps more effective. Experimental results show that our proposed robotic grasping system greatly improves the grasping success rate in dense clutter and also generalizes to new scenarios.
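The abstract's two key ideas can be illustrated with a short sketch. The paper does not give formulas here, so the details below are assumptions: `dispersion_degree` uses mean pairwise distance between object centroids as a hypothetical stand-in for the Dispersion Degree metric, and `select_action` is a simplified coordination rule that pushes when objects are tightly packed and otherwise grasps at the argmax of the pixel-wise action-value maps. The `threshold` parameter is likewise illustrative.

```python
import numpy as np

def dispersion_degree(centroids):
    """Mean pairwise distance between object centroids (hypothetical
    stand-in for the paper's Dispersion Degree). Larger values mean
    the objects are more spread out."""
    centroids = np.asarray(centroids, dtype=float)
    n = len(centroids)
    if n < 2:
        return 0.0
    dists = [np.linalg.norm(centroids[i] - centroids[j])
             for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists))

def select_action(q_push, q_grasp, dispersion, threshold=0.05):
    """Simplified coordination mechanism over two action-value maps.

    If objects are tightly packed (low dispersion), pick the best
    push to spread them out; otherwise grasp at the location with
    the highest grasp value.
    """
    if dispersion < threshold:
        idx = np.unravel_index(np.argmax(q_push), q_push.shape)
        return ("push", idx)
    idx = np.unravel_index(np.argmax(q_grasp), q_grasp.shape)
    return ("grasp", idx)
```

In the full system each map would be produced by a fully convolutional network over multiple orientations; the sketch collapses that to a single 2-D value map per action type.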

Language: English
Source URL: [http://ir.ia.ac.cn/handle/173211/44408]
Subject Area: Research on Intelligent Robotic Systems
Corresponding Author: Lu, Tao
Affiliations: 1. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China
2. Research Center on Intelligent Robotic Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China
Recommended Citation
GB/T 7714
Lu, Ning, Lu, Tao, Cai, Yinghao, et al. Active Pushing for Better Grasping in Dense Clutter with Deep Reinforcement Learning[C]. Shanghai, China, 6-8 Nov. 2020.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.