Deterministic convergence of an online gradient method for neural networks
Document Type: Journal Article
Authors | Wu, W; Xu, YS
Journal | JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS
Publication Date | 2002-07-01
Volume | 144
Issue | 1-2
Pages | 335-347
Keywords | online stochastic gradient method; nonlinear feedforward neural networks; deterministic convergence; monotonicity; constant learning rate
ISSN | 0377-0427
Abstract | The online gradient method has been widely used as a learning algorithm for neural networks. We establish a deterministic convergence of online gradient methods for the training of a class of nonlinear feedforward neural networks when the training examples are linearly independent. We choose the learning rate eta to be a constant during the training procedure. The monotonicity of the error function in the iteration is proved. A criterion for choosing the learning rate eta is also provided to guarantee the convergence. Under certain conditions similar to those for the classical gradient methods, an optimal convergence rate for our online gradient methods is proved. (C) 2001 Elsevier Science B.V. All rights reserved.
WOS Research Area | Mathematics
Language | English
WOS Accession Number | WOS:000176295300025
Publisher | ELSEVIER SCIENCE BV
Source URL | http://ir.amss.ac.cn/handle/2S8OKBNM/17735
Collection | Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author | Wu, W
Affiliations | 1. Dalian Univ Technol, Dept Math, Dalian 116023, Peoples R China; 2. N Dakota State Univ, Dept Math, Fargo, ND 58105 USA; 3. Acad Sinica, Math Inst, Beijing 100080, Peoples R China
Recommended Citation (GB/T 7714) | Wu, W, Xu, YS. Deterministic convergence of an online gradient method for neural networks[J]. JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2002, 144(1-2): 335-347.
APA | Wu, W., & Xu, Y. S. (2002). Deterministic convergence of an online gradient method for neural networks. JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 144(1-2), 335-347.
MLA | Wu, W, et al. "Deterministic convergence of an online gradient method for neural networks." JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS 144.1-2 (2002): 335-347.
Deposit Method: OAI harvesting
Source: Academy of Mathematics and Systems Science
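
The abstract describes training by presenting the examples one at a time and updating the weights with a constant learning rate eta. Below is a minimal, illustrative sketch of such an online gradient update for a one-hidden-layer feedforward network; the sigmoid activation, squared-error loss, layer sizes, and the helper name `online_gradient_train` are assumptions made for illustration, not the paper's exact setting.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def online_gradient_train(X, y, hidden, eta=0.1, epochs=100, seed=0):
    """Online (per-sample) gradient descent with a constant learning rate eta.

    Illustrative sketch only: one hidden sigmoid layer, linear output,
    squared error 0.5 * (output - target)**2 per example.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=0.1, size=(hidden, d))   # input-to-hidden weights
    v = rng.normal(scale=0.1, size=hidden)        # hidden-to-output weights
    for _ in range(epochs):
        for j in range(n):                        # cycle through the examples one at a time
            h = sigmoid(W @ X[j])                 # hidden activations
            out = v @ h                           # network output
            err = out - y[j]                      # residual of the squared-error term
            grad_v = err * h                      # gradient w.r.t. output weights
            grad_W = np.outer(err * v * h * (1.0 - h), X[j])  # gradient w.r.t. hidden weights
            v -= eta * grad_v                     # constant-learning-rate update
            W -= eta * grad_W
    return W, v

# Toy usage: linearly independent inputs, matching the paper's hypothesis on the data.
if __name__ == "__main__":
    X = np.eye(3)
    y = np.array([0.2, 0.5, 0.8])
    W, v = online_gradient_train(X, y, hidden=4, eta=0.5, epochs=200)
```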