Chinese Academy of Sciences Institutional Repositories Grid
Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms

Document Type: Journal Article

Authors: Cheng, Daning [1]; Li, Shigang [2]; Zhang, Hanping [3]; Xia, Fen [3]; Zhang, Yunquan [4]
Journal: IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS
Publication Date: 2021-07-01
Volume: 32; Issue: 7; Pages: 1702-1712
Keywords: Training; Scalability; Machine learning; Machine learning algorithms; Stochastic processes; Task analysis; Upper bound; Parallel training algorithms; training dataset; scalability; stochastic optimization methods
ISSN: 1045-9219
DOI: 10.1109/TPDS.2020.3048836
Abstract: As the training dataset size and the model size of machine learning increase rapidly, more computing resources are consumed to speed up the training process. However, the scalability and performance reproducibility of parallel machine learning training, which mainly uses stochastic optimization algorithms, are limited. In this paper, we demonstrate that sample differences in the dataset play a prominent role in the scalability of parallel machine learning algorithms. We propose to use statistical properties of the dataset to measure sample differences. These properties include the variance of sample features, sample sparsity, sample diversity, and similarity in sampling sequences. We choose four types of parallel training algorithms as our research objects: (1) the asynchronous parallel SGD algorithm (Hogwild! algorithm), (2) the parallel model averaging SGD algorithm (minibatch SGD algorithm), (3) the decentralized optimization algorithm, and (4) the dual coordinate optimization algorithm (DADM). Our results show that the statistical properties of training datasets determine the scalability upper bound of these parallel training algorithms.
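For context on the first of the four algorithm classes named in the abstract, below is a minimal sketch of the Hogwild! pattern, i.e., asynchronous, lock-free parallel SGD, applied to a synthetic least-squares problem. The objective, data, learning rate, and thread count are illustrative assumptions and are not taken from the paper.

```python
import threading
import numpy as np

# Synthetic least-squares problem (hypothetical data, illustration only).
rng = np.random.default_rng(0)
n_samples, n_features = 1000, 20
X = rng.normal(size=(n_samples, n_features))
y = X @ rng.normal(size=n_features) + 0.01 * rng.normal(size=n_samples)

w = np.zeros(n_features)   # shared parameter vector, updated without locks
lr = 0.01

def worker(seed, n_steps=2000):
    """Each thread samples examples and updates the shared weights w
    directly, with no synchronization -- the Hogwild! scheme."""
    local_rng = np.random.default_rng(seed)
    for _ in range(n_steps):
        i = local_rng.integers(n_samples)     # draw one random sample
        grad = (X[i] @ w - y[i]) * X[i]       # grad of 0.5*(x_i . w - y_i)^2
        w[:] = w - lr * grad                  # lock-free update of shared w

threads = [threading.Thread(target=worker, args=(s,)) for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("RMS training residual:", np.linalg.norm(X @ w - y) / np.sqrt(n_samples))
```

In this scheme, overlapping unsynchronized updates can overwrite each other; how much that hurts convergence depends on properties of the data such as sparsity and sample similarity, which is the kind of dataset dependence the paper analyzes.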
Funding: National Natural Science Foundation of China [61972376]; National Natural Science Foundation of China [61502450]; National Natural Science Foundation of China [61432018]; National Natural Science Foundation of China [61521092]; National Key Research and Development Program of China [2016YFB0200800]; National Key Research and Development Program of China [2016YFB0200803]; National Key Research and Development Program of China [2017YFB0202302]; National Key Research and Development Program of China [2017YFB0202105]; State Key Laboratory of Computer Architecture Foundation [CARCH3504]; Natural Science Foundation of Beijing [L182053]
WOS Research Areas: Computer Science; Engineering
Language: English
WOS Accession Number: WOS:000621405200017
Publisher: IEEE COMPUTER SOC
Source URL: http://119.78.100.204/handle/2XEOYT63/16910
Collection: Institute of Computing Technology, Chinese Academy of Sciences - Journal Papers (English)
Corresponding Author: Zhang, Yunquan
Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, SKL, Beijing, Peoples R China
2. Swiss Fed Inst Technol, Dept Comp Sci, Zurich, Switzerland
3. Beijing Wisdom Uranium Technol Co Ltd, Algorithm Dept, Beijing, Peoples R China
4. Chinese Acad Sci, Inst Comp Technol, SKL Comp Architecture, Beijing, Peoples R China
Recommended Citation:
GB/T 7714: Cheng, Daning, Li, Shigang, Zhang, Hanping, et al. Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms[J]. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2021, 32(7): 1702-1712.
APA: Cheng, Daning, Li, Shigang, Zhang, Hanping, Xia, Fen, & Zhang, Yunquan. (2021). Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 32(7), 1702-1712.
MLA: Cheng, Daning, et al. "Why Dataset Properties Bound the Scalability of Parallel Machine Learning Training Algorithms". IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS 32.7 (2021): 1702-1712.

Deposit Method: OAI Harvesting

Source: Institute of Computing Technology

