Rectified Exponential Units for Convolutional Neural Networks
Document Type | Journal Article
Authors | Ying, Yao (1); Su, Jianlin (2); Shan, Peng (1); Miao, Ligang (3); Wang, Xiaolian (4) |
Journal | IEEE ACCESS
Publication Date | 2019 |
Volume | 7 | Pages | 101633-101640 |
Keywords | Activation function; convolutional neural network; rectified exponential unit; parametric rectified exponential unit |
ISSN | 2169-3536 |
DOI | 10.1109/ACCESS.2019.2928442 |
Corresponding Author | Shan, Peng (peng.shan@neuq.edu.cn) |
Abstract | The rectified linear unit (ReLU) plays an important role in today's convolutional neural networks (CNNs). In this paper, we propose a novel activation function called the Rectified Exponential Unit (REU). Inspired by two recently proposed activation functions, the Exponential Linear Unit (ELU) and Swish, the REU is designed to combine the advantages of a flexible exponent with a multiplicative functional form. Moreover, we propose the Parametric REU (PREU) to increase the expressive power of the REU. Experiments with three classical CNN architectures (LeNet-5, Network in Network, and Residual Network (ResNet)) on benchmarks of various scales, including Fashion-MNIST, CIFAR-10, CIFAR-100, and Tiny ImageNet, demonstrate that REU and PREU achieve improvements over other activation functions. Our results show that REU yields relative error improvements over ReLU of 7.74% and 6.08% on CIFAR-10 and CIFAR-100 with ResNet, while the corresponding improvements for PREU are 9.24% and 9.32%. Finally, we use different PREU variants in the residual unit to achieve more stable results. |
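The abstract describes REU as combining an exponential negative branch (as in ELU) with a multiplicative form (as in Swish). A minimal NumPy sketch of one plausible reading, assuming the negative branch is x·exp(x) and that PREU adds a learnable scale alpha on that branch (the exact parameterizations are given in the paper, not in this record):

```python
import numpy as np

def reu(x):
    # Rectified Exponential Unit sketch: identity for positive inputs;
    # for x <= 0 the input is multiplied by an exponential term.
    # The form x * exp(x) is an assumption based on the abstract's
    # "flexible exponent and multiplication function form".
    return np.where(x > 0, x, x * np.exp(x))

def preu(x, alpha=1.0):
    # Parametric REU sketch: alpha is a hypothetical learnable scale
    # on the negative branch; see the paper for the exact definition.
    return np.where(x > 0, x, alpha * x * np.exp(x))
```

Under this assumed form the function is continuous at 0, saturates toward 0 as x goes to negative infinity (like ELU), and is non-monotonic with a minimum of -1/e at x = -1 (like Swish).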
Funding Projects | National Natural Science Foundation of China [61601104]; Natural Science Foundation of Hebei Province [F2017501052] |
WOS Research Areas | Computer Science; Engineering; Telecommunications |
Language | English |
WOS Record No. | WOS:000481688500091 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Funding Organizations | National Natural Science Foundation of China; Natural Science Foundation of Hebei Province |
Source URL | [http://ir.ia.ac.cn/handle/173211/27574] |
Collection | Institute of Automation_Research Center for Intelligent Manufacturing Technology and Systems_Multidimensional Data Analysis Team |
Author Affiliations | 1. Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Liaoning, Peoples R China; 2. Sun Yat Sen Univ, Sch Math, Guangzhou 510220, Guangdong, Peoples R China; 3. Northeastern Univ, Sch Comp & Commun Engn, Shenyang 110819, Liaoning, Peoples R China; 4. Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China |
Recommended Citation (GB/T 7714) | Ying, Yao, Su, Jianlin, Shan, Peng, et al. Rectified Exponential Units for Convolutional Neural Networks[J]. IEEE ACCESS, 2019, 7: 101633-101640. |
APA | Ying, Yao, Su, Jianlin, Shan, Peng, Miao, Ligang, Wang, Xiaolian, & Peng, Silong. (2019). Rectified Exponential Units for Convolutional Neural Networks. IEEE ACCESS, 7, 101633-101640. |
MLA | Ying, Yao, et al. "Rectified Exponential Units for Convolutional Neural Networks". IEEE ACCESS 7 (2019): 101633-101640. |
Ingestion Method: OAI Harvest
Source: Institute of Automation
Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.