Chinese Academy of Sciences Institutional Repositories Grid
Parallel Multistage Wide Neural Network

Document Type: Journal Article

Authors: Xi, Jiangbo [6]; Ersoy, Okan K. [5]; Fang, Jianwu [4]; Wu, Tianjun [3]; Wei, Xin [2]; Zhao, Chaoying [1]
Journal: IEEE Transactions on Neural Networks and Learning Systems
Keywords: Ensemble learning; incremental learning; multistage wide learning; parallel testing
ISSN: 2162-237X; 2162-2388
DOI: 10.1109/TNNLS.2021.3120331
Ownership Rank: 5
Abstract:

Deep learning networks have achieved great success in many areas, such as large-scale image processing. However, they usually require large amounts of computing resources and time, and they inefficiently process easy and hard samples in the same way. Another undesirable problem is that such a network generally needs to be retrained to learn new incoming data. Efforts have been made to reduce computing resources and realize incremental learning by adjusting architectures, such as scalable effort classifiers, multi-grained cascade forest (gcForest), conditional deep learning (CDL), Tree-CNN, decision tree structure with knowledge transfer (ERDK), and forest of decision trees with radial basis function (RBF) networks and knowledge transfer (FDRK). In this article, a parallel multistage wide neural network (PMWNN) is presented. It is composed of multiple stages that classify different parts of the data. First, a wide radial basis function (WRBF) network is designed to learn features efficiently in the wide direction. It can work on both vector and image instances and can be trained in one epoch using subsampling and least squares (LS). Second, successive stages of WRBF networks are combined to form the PMWNN. Each stage focuses on the samples misclassified by the previous stage. The network can stop growing at an early stage, and a stage can be added incrementally when new training data are acquired. Finally, the stages of the PMWNN can be tested in parallel, thus speeding up the testing process. In summary, the proposed PMWNN has the advantages of: 1) optimized computing resources; 2) incremental learning; and 3) parallel testing with stages.
Experimental results on the MNIST data, a number of large hyperspectral remote sensing datasets, and various image and nonimage datasets from different application areas show that the WRBF and PMWNN work well on both image and nonimage data and achieve very competitive accuracy compared with other learning models, such as stacked autoencoders, deep belief nets, the support vector machine (SVM), the multilayer perceptron (MLP), LeNet-5, the RBF network, the recently proposed CDL, broad learning, gcForest, ERDK, and FDRK.
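The training and testing procedure described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and the Gaussian RBF kernel, random subsampling of centers, and summing of stage scores at test time are simplifying assumptions; the paper's exact WRBF formulation and stage-combination rule may differ.

```python
import numpy as np

def rbf_features(X, centers, gamma=1.0):
    # Gaussian RBF activations: exp(-gamma * ||x - c||^2) for each center c.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_wrbf_stage(X, Y_onehot, n_centers, gamma, rng):
    # Subsample centers from the data, then solve the output weights in
    # closed form by least squares -- a single pass, i.e., "one epoch".
    idx = rng.choice(len(X), size=min(n_centers, len(X)), replace=False)
    centers = X[idx]
    H = rbf_features(X, centers, gamma)
    W, *_ = np.linalg.lstsq(H, Y_onehot, rcond=None)
    return centers, W

def fit_pmwnn(X, y, n_classes, n_stages=3, n_centers=50, gamma=1.0, seed=0):
    # Successive stages: each new stage is trained only on the samples
    # the previous stage misclassified; growth stops early if none remain.
    rng = np.random.default_rng(seed)
    Y = np.eye(n_classes)[y]
    stages = []
    Xs, Ys, ys = X, Y, y
    for _ in range(n_stages):
        if len(Xs) == 0:
            break  # nothing left to correct: stop growing early
        centers, W = fit_wrbf_stage(Xs, Ys, n_centers, gamma, rng)
        stages.append((centers, W))
        pred = rbf_features(Xs, centers, gamma) @ W
        wrong = pred.argmax(axis=1) != ys
        Xs, Ys, ys = Xs[wrong], Ys[wrong], ys[wrong]
    return stages

def predict_pmwnn(stages, X, gamma=1.0):
    # The stages are independent, so their scores could be computed in
    # parallel at test time; here we simply sum them sequentially.
    scores = sum(rbf_features(X, c, gamma) @ W for c, W in stages)
    return scores.argmax(axis=1)
```

Incremental learning then amounts to appending a new `(centers, W)` stage fitted on newly acquired data, leaving earlier stages untouched.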

Language: English
Publisher: Institute of Electrical and Electronics Engineers Inc.
Source URL: http://ir.opt.ac.cn/handle/181661/95372
Collection: Xi'an Institute of Optics and Precision Mechanics, Space Optics Application Laboratory
Author Affiliations:
1. School of Geology Engineering and Geomatics, Chang'an University, Xi'an 710054, China;
2. Xi'an Institute of Optics and Precision Mechanics, Xi'an 710072, China;
3. Department of Mathematics and Information Science, College of Science, Chang'an University, Xi'an 710064, China;
4. School of Electronic and Control Engineering, Chang'an University, Xi'an 710064, China;
5. School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN 47907 USA (e-mail: ersoy@purdue.edu);
6. School of Geology Engineering and Geomatics, Chang'an University, Xi'an 710054, China (e-mail: xijiangbo@chd.edu.cn).
Recommended Citation:
GB/T 7714: Xi, Jiangbo, Ersoy, Okan K., Fang, Jianwu, et al. Parallel Multistage Wide Neural Network[J]. IEEE Transactions on Neural Networks and Learning Systems.
APA: Xi, Jiangbo, Ersoy, Okan K., Fang, Jianwu, Wu, Tianjun, Wei, Xin, & Zhao, Chaoying. Parallel Multistage Wide Neural Network. IEEE Transactions on Neural Networks and Learning Systems.
MLA: Xi, Jiangbo, et al. "Parallel Multistage Wide Neural Network." IEEE Transactions on Neural Networks and Learning Systems.

Ingestion Method: OAI Harvesting

Source: Xi'an Institute of Optics and Precision Mechanics


Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.