Chinese Academy of Sciences Institutional Repositories Grid
A CGRA based Neural Network Inference Engine for Deep Reinforcement Learning

Document Type: Conference Paper

Authors: Minglan Liang; Mingsong Chen; Zheng Wang
Publication Date: 2018
Conference Date: 2018
Conference Location: Chengdu, China
Abstract: The recent ultra-fast development of artificial intelligence algorithms has created demand for dedicated neural network accelerators, whose high computing performance and low power consumption enable the deployment of deep learning algorithms on edge computing nodes. State-of-the-art deep learning engines mostly support supervised learning models such as CNNs and RNNs, whereas very few AI engines support on-chip reinforcement learning, which is the foremost algorithmic kernel of the decision-making subsystem of an autonomous system. In this work, a Coarse-Grained Reconfigurable Array (CGRA)-like AI computing engine has been designed for the deployment of both supervised and reinforcement learning. Logic synthesis at a design frequency of 200 MHz in a 65 nm CMOS technology shows that the proposed engine occupies 0.32 mm² of silicon area and consumes 15.45 mW of power. The proposed on-chip AI engine facilitates the implementation of end-to-end perceptual and decision-making networks, which can find wide employment in autonomous driving, robotics, and UAVs.
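For context, the reinforcement-learning inference workload such an engine targets is essentially a small neural-network forward pass followed by an argmax over action values. The sketch below is illustrative only and is not the authors' design; the layer sizes, weights, and function names are assumptions made for the example.

```python
# Minimal sketch of a Q-network inference step (deep Q-learning style).
# Layer sizes, weights, and names are illustrative assumptions,
# not taken from the paper.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def q_network_forward(state, w1, b1, w2, b2):
    """Two-layer MLP mapping a state vector to per-action Q-values."""
    hidden = relu(state @ w1 + b1)
    return hidden @ w2 + b2  # one Q-value per discrete action

# Example: 8-dimensional state, 16 hidden units, 4 discrete actions.
rng = np.random.default_rng(0)
w1, b1 = rng.standard_normal((8, 16)) * 0.1, np.zeros(16)
w2, b2 = rng.standard_normal((16, 4)) * 0.1, np.zeros(4)

state = rng.standard_normal(8)
q_values = q_network_forward(state, w1, b1, w2, b2)
action = int(np.argmax(q_values))  # greedy action selection
print(q_values, action)
```

On a CGRA-style accelerator, the matrix-vector products and activation functions in such a forward pass are the operations mapped onto the reconfigurable processing elements.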
Source URL: http://ir.siat.ac.cn:8080/handle/172644/13736
Collection: Shenzhen Institutes of Advanced Technology, Institute of Integrated Technology
Recommended Citation (GB/T 7714):
Minglan Liang, Mingsong Chen, Zheng Wang. A CGRA based Neural Network Inference Engine for Deep Reinforcement Learning[C]. Chengdu, China, 2018.

Deposit Method: OAI harvesting

Source: Shenzhen Institutes of Advanced Technology

