Chinese Academy of Sciences Institutional Repositories Grid
Reading selectively via Binary Input Gated Recurrent Unit

Document Type: Conference Paper

Authors: Li Z (李哲); Wang PS (王培松); Lu HQ (卢汉清); Cheng J (程健)
Publication Date: 2019-08
Conference Date: 2019-08
Conference Venue: Macao, China
Abstract

Recurrent Neural Networks (RNNs) have shown great promise in sequence modeling tasks. The Gated Recurrent Unit (GRU) is one of the most widely used recurrent structures, striking a good trade-off between performance and computational cost. However, its practical implementation based on soft gates only partially achieves the goal of controlling information flow, and it is hard to explain what the network has learnt internally. Inspired by human reading, we introduce the Binary Input Gated Recurrent Unit (BIGRU), a GRU-based model that uses a binary input gate in place of GRU's reset gate. This allows the model to read selectively during inference. In our experiments, we show that BIGRU mainly ignores conjunctions, adverbs and articles that make little difference to document understanding, which helps us further understand how the network works. In addition, owing to reduced interference from redundant information, our model outperforms the baseline GRU on all testing tasks.
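The abstract's core idea, a GRU cell whose soft reset gate is replaced by a hard binary gate on the input, can be sketched as follows. This is a minimal inference-time illustration, not the authors' implementation: the function and parameter names are hypothetical, the exact placement of the gate is an assumption based on the abstract, and training such a gate would require a differentiable surrogate (e.g. a straight-through estimator), which is omitted here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def bigru_cell_forward(x, h_prev, params):
    """One forward step of a hypothetical BIGRU cell (inference only).

    A binary gate g in {0, 1} decides whether the current token x is
    read at all; when g == 0 the candidate state ignores the input,
    mimicking a reader skipping an uninformative word.
    """
    Wg, Ug, bg = params["Wg"], params["Ug"], params["bg"]  # binary input gate
    Wz, Uz, bz = params["Wz"], params["Uz"], params["bz"]  # update gate
    Wh, Uh, bh = params["Wh"], params["Uh"], params["bh"]  # candidate state

    # Binary input gate: hard-threshold a sigmoid pre-activation.
    g = (sigmoid(Wg @ x + Ug @ h_prev + bg) > 0.5).astype(x.dtype)

    # Candidate state only sees the input when the gate is open.
    h_tilde = np.tanh(Wh @ (g * x) + Uh @ h_prev + bh)

    # Standard GRU-style update gate interpolates old and candidate state.
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)
    h = (1.0 - z) * h_prev + z * h_tilde
    return h, g
```

Returning `g` alongside the new state makes the gating decisions inspectable, which is how one could observe, as the abstract reports, that the model tends to skip conjunctions, adverbs and articles.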

Language: English
Source URL: [http://ir.ia.ac.cn/handle/173211/23693]
Collection: Institute of Automation, National Laboratory of Pattern Recognition, Image and Video Analysis Group
Corresponding Author: Li Z (李哲)
Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Li Z, Wang PS, Lu HQ, et al. Reading selectively via Binary Input Gated Recurrent Unit[C]. In: . Macao, China. 2019-08.

Deposit Method: OAI Harvesting

Source: Institute of Automation


Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.