Chinese Academy of Sciences Institutional Repositories Grid
Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness

Document Type: Journal Article

Authors: Jin, Pengzhan (1,2); Lu, Lu (3); Tang, Yifa (1,2); Karniadakis, George Em (3)
Journal: NEURAL NETWORKS
Publication Date: 2020-10-01
Volume: 130, Pages: 85-99
Keywords: Neural networks; Generalization error; Learnability; Data distribution; Cover complexity; Neural network smoothness
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2020.06.024
Abstract: The accuracy of deep learning, i.e., deep neural networks, can be characterized by dividing the total error into three main types: approximation error, optimization error, and generalization error. Whereas there are some satisfactory answers to the problems of approximation and optimization, much less is known about the theory of generalization. Most existing theoretical works for generalization fail to explain the performance of neural networks in practice. To derive a meaningful bound, we study the generalization error of neural networks for classification problems in terms of data distribution and neural network smoothness. We introduce the cover complexity (CC) to measure the difficulty of learning a data set and the inverse of the modulus of continuity to quantify neural network smoothness. A quantitative bound for the expected accuracy/error is derived by considering both the CC and the neural network smoothness. Although most of the analysis is general and not specific to neural networks, we validate our theoretical assumptions and results numerically for neural networks on several image data sets. The numerical results confirm that the expected error of trained networks, scaled by the square root of the number of classes, has a linear relationship with respect to the CC. We observe a clear consistency between test loss and neural network smoothness during the training process. In addition, we demonstrate empirically that neural network smoothness decreases when the network size increases, whereas the smoothness is insensitive to the training dataset size. (C) 2020 Elsevier Ltd. All rights reserved.
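The smoothness measure named in the abstract builds on the modulus of continuity. As a minimal sketch using only the standard textbook definitions (the paper's exact norms, constants, and the resulting bound are not reproduced in this record), for a trained network f on an input domain Ω:

```latex
% Standard modulus of continuity of f on a domain \Omega:
\omega_f(\delta) = \sup_{x, y \in \Omega,\; \|x - y\| \le \delta} \|f(x) - f(y)\|

% Its generalized inverse, which is larger for smoother f:
\omega_f^{-1}(\varepsilon) = \sup\{\delta \ge 0 : \omega_f(\delta) \le \varepsilon\}
```

A small ω_f (equivalently, a large ω_f^{-1}) indicates a smoother network; the bound described in the abstract couples this smoothness quantity with the cover complexity of the data distribution.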
Funding: DOE PhILMs project [DE-SC0019453]; AFOSR [FA9550-17-1-0013]; Major Project on New Generation of Artificial Intelligence from MOST of China [2018AAA0101002]; National Natural Science Foundation of China [11771438]; DARPA AIRA grant [HR00111990025]
WOS Research Areas: Computer Science; Neurosciences & Neurology
Language: English
WOS Record: WOS:000567813200009
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Source URL: http://ir.amss.ac.cn/handle/2S8OKBNM/52160
Collection: Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author: Karniadakis, George Em
Affiliations:
1. Chinese Acad Sci, Acad Math & Syst Sci, ICMSEC, LSEC, Beijing 100190, Peoples R China
2. Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
3. Brown Univ, Div Appl Math, Providence, RI 02912 USA
Recommended Citation:
GB/T 7714: Jin, Pengzhan, Lu, Lu, Tang, Yifa, et al. Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness[J]. NEURAL NETWORKS, 2020, 130: 85-99.
APA: Jin, Pengzhan, Lu, Lu, Tang, Yifa, & Karniadakis, George Em. (2020). Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness. NEURAL NETWORKS, 130, 85-99.
MLA: Jin, Pengzhan, et al. "Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness". NEURAL NETWORKS 130 (2020): 85-99.

Ingest Method: OAI harvesting

Source: Academy of Mathematics and Systems Science

