Chinese Academy of Sciences Institutional Repositories Grid
Learning Human-to-Robot Dexterous Handovers for Anthropomorphic Hand

Document Type: Journal Article

Authors: Duan, Haonan (3,4); Wang, Peng (1,2,3,4); Li, Yiming (3,4); Li, Daheng (3,4); Wei, Wei (3,4)
Journal: IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS
Publication Date: 2023-09-01
Volume: 15  Issue: 3  Pages: 1224-1238
ISSN: 2379-8920
Keywords: Anthropomorphic hand; handovers; human-robot interaction
DOI: 10.1109/TCDS.2022.3203025
Corresponding Author: Wang, Peng (peng_wang@ia.ac.cn)
Abstract: Human-robot interaction plays an important role in enabling robots to serve human production and daily life. Object handover between humans and robots is one of the fundamental problems of human-robot interaction. The majority of current work uses parallel-jaw grippers as the end-effector, which limits the robot's ability to take miscellaneous objects from humans and manipulate them subsequently. In this article, we present a framework for human-to-robot dexterous handover using an anthropomorphic hand. The framework takes images captured by two cameras to complete handover scene understanding, grasp configuration prediction, and handover execution. To enable the robot to generalize to diverse delivered objects with miscellaneous shapes and sizes, we propose the anthropomorphic hand grasp network (AHG-Net), an end-to-end network that takes single-view point clouds of the object as input and predicts suitable anthropomorphic hand configurations under five different grasp taxonomies. To train our model, we build a large-scale dataset with 1M hand grasp annotations from 5K single-view point clouds of 200 objects. We implement a handover system using a UR5 robot arm and the HIT-DLR II anthropomorphic robot hand based on our presented framework, which can not only adapt to different human givers but also generalize to diverse novel objects with various shapes and sizes. The generalizability, reliability, and robustness of our method are demonstrated on 15 different novel objects with arbitrary handover poses from frontal and lateral positions, through a system ablation study, a grasp planner comparison, and a user study with 6 participants delivering 15 objects from two benchmark sets.
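The record describes AHG-Net only at the abstract level. As a rough illustration of the stated interface, the PyTorch sketch below shows one plausible shape for such a network: a PointNet-style encoder over a single-view point cloud feeding two heads, one classifying the grasp taxonomy (five classes, per the abstract) and one regressing a hand configuration. The class names (PointCloudEncoder, AHGNetSketch), the feature dimension, and the 6-DoF-wrist-plus-20-joint output parameterization are illustrative assumptions, not the authors' published architecture; only the single-view point-cloud input and the five grasp taxonomies come from the abstract.

    import torch
    import torch.nn as nn

    class PointCloudEncoder(nn.Module):
        """PointNet-style encoder (assumed): a shared per-point MLP followed by
        a max pool, producing one global feature per single-view point cloud."""
        def __init__(self, feat_dim=1024):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Conv1d(3, 64, 1), nn.BatchNorm1d(64), nn.ReLU(),
                nn.Conv1d(64, 128, 1), nn.BatchNorm1d(128), nn.ReLU(),
                nn.Conv1d(128, feat_dim, 1),
            )

        def forward(self, pts):                  # pts: (B, N, 3)
            x = self.mlp(pts.transpose(1, 2))    # (B, feat_dim, N)
            return x.max(dim=2).values           # (B, feat_dim)

    class AHGNetSketch(nn.Module):
        """Hypothetical stand-in for AHG-Net: one head classifies the grasp
        taxonomy (5 classes per the abstract), the other regresses a hand
        configuration. The 6-DoF wrist pose + 20 joint angles are assumed."""
        def __init__(self, n_taxonomies=5, n_joints=20, feat_dim=1024):
            super().__init__()
            self.encoder = PointCloudEncoder(feat_dim)
            self.taxonomy_head = nn.Sequential(
                nn.Linear(feat_dim, 256), nn.ReLU(),
                nn.Linear(256, n_taxonomies),
            )
            # Wrist translation + axis-angle rotation (6) plus joint angles.
            self.config_head = nn.Sequential(
                nn.Linear(feat_dim, 256), nn.ReLU(),
                nn.Linear(256, 6 + n_joints),
            )

        def forward(self, pts):
            feat = self.encoder(pts)
            return self.taxonomy_head(feat), self.config_head(feat)

    if __name__ == "__main__":
        net = AHGNetSketch()
        cloud = torch.randn(2, 2048, 3)           # batch of 2 single-view clouds
        tax_logits, hand_cfg = net(cloud)
        print(tax_logits.shape, hand_cfg.shape)   # (2, 5), (2, 26)

A complete system would additionally need the scene-understanding stage (segmenting the delivered object from the two camera views) and a planner to execute the predicted grasp on the arm and hand; both are outside the scope of this sketch.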
WOS Keywords: MOVEMENT PRIMITIVES; GRASP
Funding Projects: National Natural Science Foundation of China [91748131]; National Natural Science Foundation of China [62006229]; National Natural Science Foundation of China [61771471]; Strategic Priority Research Program of Chinese Academy of Science [XDB32050106]; InnoHK Project
WOS Research Areas: Computer Science; Robotics; Neurosciences & Neurology
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS Record No.: WOS:001089186500020
Funding Organizations: National Natural Science Foundation of China; Strategic Priority Research Program of Chinese Academy of Science; InnoHK Project
Source URL: http://ir.ia.ac.cn/handle/173211/54388
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems, Intelligent Robot Systems Research
Author Affiliations:
1. Chinese Acad Sci, Hong Kong Inst Sci & Innovat, Ctr Artificial Intelligence & Robot, Hong Kong, Peoples R China
2. Chinese Acad Sci, CAS Ctr Excellence Brain Sci & Intelligence Techn, Shanghai 200031, Peoples R China
3. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing 100049, Peoples R China
4. Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Duan, Haonan, Wang, Peng, Li, Yiming, et al. Learning Human-to-Robot Dexterous Handovers for Anthropomorphic Hand[J]. IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2023, 15(3): 1224-1238.
APA: Duan, Haonan, Wang, Peng, Li, Yiming, Li, Daheng, & Wei, Wei. (2023). Learning Human-to-Robot Dexterous Handovers for Anthropomorphic Hand. IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 15(3), 1224-1238.
MLA: Duan, Haonan, et al. "Learning Human-to-Robot Dexterous Handovers for Anthropomorphic Hand". IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS 15.3 (2023): 1224-1238.

Deposit Method: OAI harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.