Switch Unit (SU): A Novel Type of Unit for the Activation Function
Document Type: Conference Paper
Authors | Han Z (韩志)1 |
Publication Date | 2018 |
Conference Date | July 19-23, 2018 |
Conference Location | Tianjin, China |
Keywords | Switch unit; Neural networks; CIFAR10/100 |
Pages | 154-157 |
Abstract | The paper presents a novel unit, the switch unit (SU). The unit realizes a conditional include function that maps the input from a one-dimensional space to a two-dimensional space depending on whether the input is positive or not. For any input, one of the two output entries has a derivative of one with respect to the input while the other has a derivative of zero. The proposed unit is therefore able to cope with the vanishing-gradient issues induced by activation functions such as ReLUs (rectified linear units). In the experiments, neural networks built on the proposed unit are trained and tested on the CIFAR-10 and CIFAR-100 datasets, yielding results comparable to those of ReLU-based neural networks using the same parameters. |
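The abstract does not give the SU's exact formula. As a rough illustration only, the NumPy sketch below assumes the unit routes each scalar input to one of two output channels depending on its sign, so that one channel's derivative with respect to the input is always one and the other's is always zero; the name `switch_unit` and this concrete formulation are assumptions for illustration, not the authors' published definition.

```python
import numpy as np

def switch_unit(x):
    """Sketch of a switch-unit-style activation (assumed formulation).

    Maps each scalar input to two output channels: the first carries x
    when x > 0, the second carries x otherwise. For every input, exactly
    one channel has derivative 1 w.r.t. x and the other has derivative 0,
    so the gradient path is never fully blocked (unlike a dead ReLU).
    """
    positive = np.where(x > 0, x, 0.0)   # channel active for x > 0
    negative = np.where(x > 0, 0.0, x)   # channel active for x <= 0
    # Concatenate along the feature axis: output has twice the width of x.
    return np.concatenate([positive, negative], axis=-1)

# Example: a batch of 2 samples with 3 features each becomes 2 x 6.
x = np.array([[1.5, -0.3, 0.0],
              [-2.0, 0.7, 4.0]])
print(switch_unit(x))
```

Because the output width doubles, a layer following such a unit would need twice as many input weights; how the paper handles this is not stated in the abstract.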
Source Publication Author | IEEE Robotics & Automation Society |
Ownership Ranking | 1 |
Conference Proceedings | Proceedings of 2018 IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems |
Proceedings Publisher | IEEE |
Proceedings Publication Place | New York |
Language | English |
ISBN | 978-1-5386-7056-9 |
Source URL | http://ir.sia.cn/handle/173321/23860 |
Collection | 沈阳自动化研究所_机器人学研究室 (Shenyang Institute of Automation, Robotics Department) |
Corresponding Author | Han JD (韩建达) |
Author Affiliations | 1. State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China; 2. University of Chinese Academy of Sciences, Beijing 100049, China |
Recommended Citation (GB/T 7714) | Han Z, Han JD, Yang XY, et al. Switch Unit (SU): A Novel Type of Unit for the Activation Function[C]//Proceedings of 2018 IEEE 8th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems. Tianjin, China, July 19-23, 2018: 154-157. |
Ingestion Method: OAI Harvesting
Source: 沈阳自动化研究所 (Shenyang Institute of Automation)