Large-Scale and Scalable Latent Factor Analysis via Distributed Alternative Stochastic Gradient Descent for Recommender Systems
Document Type | Journal Article
Authors | Shi, Xiaoyu (2,3); He, Qiang; Luo, Xin; Bai, Yanan; Shang, Mingsheng |
Journal | IEEE TRANSACTIONS ON BIG DATA
Publication Date | 2022-04-01 |
Volume | 8 |
Issue | 2 |
Pages | 420-431 |
Keywords | Recommender systems; Training; Optimization; Big Data; Cloud computing; Computational modeling; Sparse matrices; Recommender system; latent factor analysis; high-dimensional and sparse matrices; alternative stochastic gradient descent; distributed computing |
ISSN | 2332-7790 |
DOI | 10.1109/TBDATA.2020.2973141 |
Corresponding Author | Luo, Xin (luoxin21@cigit.ac.cn) |
Abstract | Latent factor analysis (LFA) via stochastic gradient descent (SGD) is highly efficient in discovering user and item patterns from the high-dimensional and sparse (HiDS) matrices generated by recommender systems. However, most LFA-based recommender systems adopt a standard SGD algorithm, which suffers from limited scalability when addressing big data. On the other hand, most existing parallel SGD solvers are either built on a memory-sharing framework designed for a single machine or suffer from high communication costs, which also greatly limits their application in large-scale systems. To address the above issues, this article proposes a distributed alternative stochastic gradient descent (DASGD) solver for an LFA-based recommender. It decouples the training dependences among latent features by alternately fixing one half of the features and learning the other half following the principle of SGD, but in parallel. Its distribution mechanism consists of efficient data partitioning, allocation, and task parallelization strategies, which greatly reduce its communication cost for high scalability. Experimental results on three large-scale HiDS matrices generated by real-world applications demonstrate that the proposed DASGD algorithm outperforms state-of-the-art distributed SGD solvers for recommender systems in terms of prediction accuracy as well as scalability. Hence, it is highly useful for training LFA-based recommenders on large-scale HiDS matrices with the help of cloud computing facilities. |
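The abstract's core idea — alternately fixing one half of the latent features while updating the other half with SGD — can be illustrated with a minimal single-machine sketch. This is not the paper's distributed DASGD implementation; the function name, hyperparameters, and dimensions below are illustrative assumptions only.

```python
# Minimal single-machine sketch of alternating SGD for latent factor analysis
# on a sparse rating matrix. NOT the paper's distributed DASGD solver; the
# learning rate, regularization, and factor dimension are assumed values.
import numpy as np

def alternating_sgd_lfa(ratings, n_users, n_items, k=20, lr=0.01, reg=0.05, epochs=10):
    """ratings: iterable of (user, item, value) triples from a HiDS matrix."""
    rng = np.random.default_rng(0)
    P = rng.normal(scale=0.1, size=(n_users, k))  # user latent factors
    Q = rng.normal(scale=0.1, size=(n_items, k))  # item latent factors

    for _ in range(epochs):
        # Phase 1: fix item factors Q, update user factors P with SGD.
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]
            P[u] += lr * (err * Q[i] - reg * P[u])
        # Phase 2: fix user factors P, update item factors Q with SGD.
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

# Tiny usage example with synthetic data.
if __name__ == "__main__":
    data = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (2, 1, 1.0)]
    P, Q = alternating_sgd_lfa(data, n_users=3, n_items=2)
    print("predicted r(0,0):", P[0] @ Q[0])
```

Because each phase updates only one factor matrix, updates within a phase that touch different rows of that matrix are independent of each other, which is the property the paper exploits to parallelize training across distributed workers with low communication cost.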
Funding | National Natural Science Foundation of China [61602434]; National Natural Science Foundation of China [61772493]; National Natural Science Foundation of China [91646114]; National Natural Science Foundation of China [61702475]; Natural Science Foundation of Chongqing (China) [cstc2019jcyjjqX0013]; Chongqing Research Program of Technology Innovation and Application [cstc2019jscxzdztzxX0019]; Chongqing Research Program of Technology Innovation and Application [cstc2018jszx-cyztzxX0025]; Chongqing research program of key standard technologies innovation of key industries [cstc2017zdcy-zdyfX0076]; Youth Innovation Promotion Association CAS [2017393]; Pioneer Hundred Talents Program of Chinese Academy of Sciences |
WOS Research Area | Computer Science |
Language | English |
WOS Record Number | WOS:000767848400009 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Source URL | http://119.78.100.138/handle/2HOD01W0/15381 |
Collection | Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences |
Affiliations |
1. Hong Kong Polytech Univ, Dept Comp, Kowloon, Hong Kong 999077, Peoples R China
2. Chinese Acad Sci, Chongqing Engn Res Ctr Big Data Applicat Smart Ci, Chongqing 400714, Peoples R China
3. Chinese Acad Sci, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing Inst Green & Intelligent Technol, Chongqing 400714, Peoples R China
4. Hengrui Chongqing Artificial Intelligence Res Ctr, Dept Big Data Analyses Techn, Chongqing 401331, Peoples R China
5. Swinburne Univ Technol, Sch Software & Elect Engn, Melbourne, Vic 3122, Australia
Recommended Citation (GB/T 7714) | Shi, Xiaoyu, He, Qiang, Luo, Xin, et al. Large-Scale and Scalable Latent Factor Analysis via Distributed Alternative Stochastic Gradient Descent for Recommender Systems[J]. IEEE TRANSACTIONS ON BIG DATA, 2022, 8(2): 420-431. |
APA | Shi, Xiaoyu, He, Qiang, Luo, Xin, Bai, Yanan, & Shang, Mingsheng. (2022). Large-Scale and Scalable Latent Factor Analysis via Distributed Alternative Stochastic Gradient Descent for Recommender Systems. IEEE TRANSACTIONS ON BIG DATA, 8(2), 420-431. |
MLA | Shi, Xiaoyu, et al. "Large-Scale and Scalable Latent Factor Analysis via Distributed Alternative Stochastic Gradient Descent for Recommender Systems." IEEE TRANSACTIONS ON BIG DATA 8.2 (2022): 420-431. |
Ingest Method: OAI harvesting
Source: Chongqing Institute of Green and Intelligent Technology