Accelerating Minibatch Stochastic Gradient Descent Using Typicality Sampling
Document Type | Journal Article
Authors | Peng, Xinyu (2); Li, Li (1); Wang, Fei-Yue (3)
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Publication Date | 2020-11-01
Volume | 31
Issue | 11
Pages | 4649-4659
Keywords | Training; Convergence; Approximation algorithms; Stochastic processes; Estimation; Optimization; Acceleration; Batch selection; machine learning; minibatch stochastic gradient descent (SGD); speed of convergence
ISSN | 2162-237X
DOI | 10.1109/TNNLS.2019.2957003
Corresponding Author | Li, Li (li-li@tsinghua.edu.cn)
Abstract | Machine learning, especially deep neural networks, has developed rapidly in fields including computer vision, speech recognition, and reinforcement learning. Although minibatch stochastic gradient descent (SGD) is one of the most popular stochastic optimization methods for training deep networks, it shows a slow convergence rate due to the large noise in the gradient approximation. In this article, we attempt to remedy this problem by building a more efficient batch selection method based on typicality sampling, which reduces the error of gradient estimation in conventional minibatch SGD. We analyze the convergence rate of the resulting typical batch SGD algorithm and compare the convergence properties of minibatch SGD and the proposed algorithm. Experimental results demonstrate that our batch selection scheme works well and that more complex minibatch SGD variants can benefit from the proposed batch selection strategy. (An illustrative sketch of the batch selection idea follows this record.)
Funding Projects | National Key Research and Development Program of China [2018AAA0101400]; National Natural Science Foundation of China [91720000]; Beijing Municipal Science and Technology Commission [Z181100008918007]
WOS Research Areas | Computer Science; Engineering
Language | English
WOS Accession Number | WOS:000587699700019
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Funding Agencies | National Key Research and Development Program of China; National Natural Science Foundation of China; Beijing Municipal Science and Technology Commission
Source URL | http://ir.ia.ac.cn/handle/173211/41732
Collection | Institute of Automation, State Key Laboratory of Management and Control for Complex Systems, Advanced Control and Automation Team
Author Affiliations | 1. Tsinghua Univ, Dept Automat, Tsinghua Natl Lab Informat Sci & Technol, Beijing 100084, Peoples R China; 2. Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China; 3. Chinese Acad Sci, Inst Automat, State Key Lab Management & Control Complex Syst, Beijing 100080, Peoples R China
Recommended Citation (GB/T 7714) | Peng, Xinyu, Li, Li, Wang, Fei-Yue. Accelerating Minibatch Stochastic Gradient Descent Using Typicality Sampling[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31(11): 4649-4659.
APA | Peng, Xinyu, Li, Li, & Wang, Fei-Yue. (2020). Accelerating Minibatch Stochastic Gradient Descent Using Typicality Sampling. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 31(11), 4649-4659.
MLA | Peng, Xinyu, et al. "Accelerating Minibatch Stochastic Gradient Descent Using Typicality Sampling." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS 31.11 (2020): 4649-4659.
Ingestion Method | OAI harvesting
Source | Institute of Automation
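
The abstract above describes the method only at a high level: each minibatch is chosen so that it better represents the underlying data distribution, reducing the error of the gradient estimate. Below is a minimal, hedged Python sketch of that general idea, not the authors' exact algorithm. Everything concrete here is an assumption introduced for illustration: the kernel-density typicality score, the `typical_frac` mixing ratio, and the linear least-squares model are all stand-ins, and `kde_typicality_scores`, `typical_batch`, and `sgd_step` are hypothetical helper names.

```python
# Illustrative sketch only -- NOT the paper's exact algorithm.
# Assumed here: "typicality" is approximated by a kernel-density score,
# and each minibatch mixes high-density ("typical") samples with
# uniformly drawn ones; the model is plain linear least squares.
import numpy as np


def kde_typicality_scores(X, bandwidth=1.0, n_ref=256, rng=None):
    """Score each sample by its average Gaussian-kernel similarity to a
    random reference subset (a crude stand-in for membership in the
    data distribution's high-probability region)."""
    if rng is None:
        rng = np.random.default_rng(0)
    ref = X[rng.choice(len(X), size=min(n_ref, len(X)), replace=False)]
    sq_dists = ((X[:, None, :] - ref[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2)).mean(axis=1)


def typical_batch(scores, batch_size, typical_frac=0.5, rng=None):
    """Draw a minibatch: `typical_frac` of it from the highest-scoring
    (most typical) samples, the rest uniformly at random."""
    if rng is None:
        rng = np.random.default_rng(0)
    n_typ = int(batch_size * typical_frac)
    pool = np.argsort(scores)[-max(4 * n_typ, batch_size):]  # top-density pool
    typ = rng.choice(pool, size=n_typ, replace=False)
    uni = rng.choice(len(scores), size=batch_size - n_typ, replace=False)
    return np.concatenate([typ, uni])  # occasional overlap is harmless for SGD


def sgd_step(w, X, y, idx, lr=0.1):
    """One minibatch SGD step for the least-squares loss ||Xw - y||^2 / n."""
    Xb, yb = X[idx], y[idx]
    grad = 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)
    return w - lr * grad


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    y = X @ np.ones(5) + 0.1 * rng.normal(size=1000)
    w = np.zeros(5)
    scores = kde_typicality_scores(X, rng=rng)  # computed once, reused per step
    for _ in range(200):
        idx = typical_batch(scores, batch_size=32, rng=rng)
        w = sgd_step(w, X, y, idx)
    print("final loss:", ((X @ w - y) ** 2).mean())
```

The design intent of the sketch mirrors the abstract's claim: filling part of each batch with samples that are representative of the bulk of the data should lower the variance of the gradient estimate relative to purely uniform sampling, while the uniform remainder preserves stochastic exploration. How "typical" samples are actually identified and how the batch is composed in the paper must be taken from the article itself.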