Chinese Academy of Sciences Institutional Repositories Grid
Meta-Imitation Learning by Watching Video Demonstrations

Document Type: Conference Paper

Authors: Li, Jiayi 1,2; Lu, Tao 1; Cao, Xiaoge 1,2; Cai, Yinghao 1; Wang, Shuo 1,2,3
Publication Date: 2022-05
Conference Date: 2022-04-25 to 2022-04-29
Conference Location: Online
Abstract

Meta-imitation learning is a promising technique that enables a robot to learn a new task from observing one or a few human demonstrations. However, it usually requires a large number of demonstrations from both humans and robots during the meta-training phase, and collecting this data is laborious, especially when recording robot actions and specifying the correspondence between human and robot. In this work, we present an approach to meta-imitation learning by watching video demonstrations from humans. In comparison to prior works, our approach translates human videos into practical robot demonstrations and trains the meta-policy with an adaptive loss based on the quality of the translated data. Our approach relies only on human videos and does not require robot demonstrations, which facilitates data collection and is more consistent with human imitation behavior. Experiments show that our method achieves performance comparable to the baseline in quickly learning a set of vision-based tasks from watching a single video demonstration.
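The abstract describes the method only at a high level. As a reading aid, the following is a minimal, hypothetical PyTorch sketch of a MAML-style meta-imitation update in which each task's outer loss is scaled by a scalar quality score for its translated demonstration, one plausible reading of the "adaptive loss" mentioned above. The toy Policy network, the task dictionary fields, and the quality score are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a MAML-style meta-imitation update
# with a quality-weighted outer loss. The policy, the translated-demo fields,
# and the quality score are hypothetical stand-ins for components the abstract
# only names at a high level.
import torch
import torch.nn as nn
import torch.nn.functional as F


class Policy(nn.Module):
    """Toy policy: maps a state vector to an action vector."""
    def __init__(self, state_dim=16, action_dim=4):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim, 64), nn.ReLU(),
                                 nn.Linear(64, action_dim))

    def forward(self, s):
        return self.net(s)


def adapted_forward(policy, params, s):
    """Run the policy with explicitly supplied (inner-loop adapted) weights."""
    h = F.relu(F.linear(s, params["net.0.weight"], params["net.0.bias"]))
    return F.linear(h, params["net.2.weight"], params["net.2.bias"])


def meta_update(policy, meta_opt, tasks, inner_lr=0.01):
    """One outer-loop step over a batch of tasks.

    Each task carries a support demo translated from a human video, a query
    demo, and a scalar `quality` in [0, 1] that down-weights poorly translated
    data -- a stand-in for the adaptive loss described in the abstract.
    """
    meta_opt.zero_grad()
    outer_loss = 0.0
    for task in tasks:
        params = dict(policy.named_parameters())
        # Inner loop: one gradient step on the translated support demonstration.
        support_loss = F.mse_loss(
            adapted_forward(policy, params, task["support_states"]),
            task["support_actions"])
        grads = torch.autograd.grad(support_loss, params.values(),
                                    create_graph=True)
        adapted = {k: p - inner_lr * g
                   for (k, p), g in zip(params.items(), grads)}
        # Outer loss on the query demo, scaled by the translation quality score.
        query_loss = F.mse_loss(
            adapted_forward(policy, adapted, task["query_states"]),
            task["query_actions"])
        outer_loss = outer_loss + task["quality"] * query_loss
    (outer_loss / len(tasks)).backward()
    meta_opt.step()
```

In a real pipeline the toy state/action tensors would be replaced by visual observations, and `quality` would come from the video-to-demonstration translation stage; the per-task weighting shown here is only one way such a quality signal could enter the meta-training loss.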

Language: English
Source URL: http://ir.ia.ac.cn/handle/173211/48539
Collection: Intelligent Robotic Systems Research
Corresponding Author: Lu, Tao
Author Affiliations:
1. State Key Laboratory of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences
2. School of Artificial Intelligence, University of Chinese Academy of Sciences
3. Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Li, Jiayi, Lu, Tao, Cao, Xiaoge, et al. Meta-Imitation Learning by Watching Video Demonstrations[C]. In: . Online. 2022-04-25 to 2022-04-29.

Archiving Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.