Chinese Academy of Sciences Institutional Repositories Grid
WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system

Document Type: Journal Article

Authors: Cheng Daning 1,2; Li Shigang 1,3; Zhang Yunquan 1
Journal: JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING
Publication Date: 2020-11-01
Volume: 145, Pages: 202-216
Keywords: SGD; Unbalanced workload; SimuParallel SGD; Distributed system
ISSN: 0743-7315
DOI: 10.1016/j.jpdc.2020.06.011
Abstract: Stochastic gradient descent (SGD) is a popular stochastic optimization method in machine learning. Traditional parallel SGD algorithms, e.g., SimuParallel SGD (Zinkevich, 2010), often require all nodes to have the same performance or to consume equal quantities of data. However, these requirements are difficult to satisfy when parallel SGD algorithms run in a heterogeneous computing environment; low-performance nodes exert a negative influence on the final result. In this paper, we propose an algorithm called weighted parallel SGD (WP-SGD). WP-SGD combines weighted model parameters from different nodes in the system to produce the final output. WP-SGD exploits the reduction in standard deviation to compensate for the loss caused by the inconsistent performance of nodes in the cluster, which means that WP-SGD does not require all nodes to consume equal quantities of data. We also propose methods for running two other parallel SGD algorithms in combination with WP-SGD in a heterogeneous environment. The experimental results show that WP-SGD significantly outperforms traditional parallel SGD algorithms on distributed training systems with an unbalanced workload. (C) 2020 Elsevier Inc. All rights reserved.
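The abstract describes WP-SGD only at a high level, so the following minimal sketch illustrates the weighted-combination idea it mentions: each node runs plain SGD over however much local data it manages to process, and a driver merges the resulting models with per-node weights. The toy least-squares model, the learning rate, and the choice of weights proportional to the number of samples each node consumed are assumptions made here for illustration; the paper derives its own weighting scheme to offset the influence of slow nodes.

import numpy as np

def local_sgd(w0, data, labels, lr=0.01):
    # Plain SGD on one node's local shard, for a toy least-squares model.
    w = w0.copy()
    for x, y in zip(data, labels):
        grad = (x @ w - y) * x        # gradient of 0.5 * (x.w - y)^2
        w -= lr * grad
    return w

def weighted_combine(models, weights):
    # Merge per-node models into a single output by a weighted average.
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return sum(wgt * m for wgt, m in zip(weights, models))

rng = np.random.default_rng(0)
d = 5
w_true = rng.normal(size=d)

# Heterogeneous nodes: each consumes a different quantity of data.
node_sizes = [2000, 800, 300]
models, weights = [], []
for n in node_sizes:
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.1 * rng.normal(size=n)
    models.append(local_sgd(np.zeros(d), X, y))
    weights.append(n)                 # illustrative weight choice only

w_out = weighted_combine(models, weights)
print("parameter error of weighted combination:", np.linalg.norm(w_out - w_true))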
Funding: National Key R&D Program of China [2016YFB0200803]; National Key R&D Program of China [2017YFB0202302]; National Key R&D Program of China [2017YFB0202001]; National Key R&D Program of China [2017YFB0202502]; National Key R&D Program of China [2017YFB0202105]; National Key R&D Program of China [2018YFB0704002]; National Key R&D Program of China [2018YFC0809306]; Strategic Priority Research Program of Chinese Academy of Sciences [XDC01000000]; National Natural Science Foundation of China [61972376]; National Natural Science Foundation of China [61502450]; National Natural Science Foundation of China [61432018]; National Natural Science Foundation of China [61521092]; Science Foundation of Beijing [L182053]; SKL of Computer Architecture Foundation [CARCH3504]
WOS Research Area: Computer Science
Language: English
WOS Accession Number: WOS:000568803300015
Publisher: ACADEMIC PRESS INC ELSEVIER SCIENCE
Source URL: [http://119.78.100.204/handle/2XEOYT63/15556]
Collection: Institute of Computing Technology, Chinese Academy of Sciences - Journal Papers (English)
Corresponding Author: Li Shigang
Author Affiliations: 1. Chinese Acad Sci, Inst Comp Technol, SKL Comp Architecture, Beijing, Peoples R China
2. Univ Chinese Acad Sci, Beijing, Peoples R China
3. Swiss Fed Inst Technol, Dept Comp Sci, Zurich, Switzerland
Recommended Citation:
GB/T 7714
Cheng Daning, Li Shigang, Zhang Yunquan. WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system[J]. JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 2020, 145: 202-216.
APA Cheng Daning, Li Shigang, & Zhang Yunquan. (2020). WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system. JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING, 145, 202-216.
MLA Cheng Daning, et al. "WP-SGD: Weighted parallel SGD for distributed unbalanced-workload training system". JOURNAL OF PARALLEL AND DISTRIBUTED COMPUTING 145 (2020): 202-216.

Ingest Method: OAI Harvesting

Source: Institute of Computing Technology


Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.