A proportional-integral-derivative-incorporated stochastic gradient descent-based latent factor analysis model
Document Type: Journal Article
Authors | Li, Jinli (3); Yuan, Ye (4,5,6); Ruan, Tao; Chen, Jia; Luo, Xin |
Journal | NEUROCOMPUTING |
Publication Date | 2021-02-28 |
Volume | 427 |
Pages | 29-39 |
Keywords | Big data; Stochastic gradient descent; Proportional-integral-derivative (PID) controller; High-dimensional and sparse matrix; Latent factor analysis |
ISSN | 0925-2312 |
DOI | 10.1016/j.neucom.2020.11.029 |
Corresponding Author | Luo, Xin (luoxin21@gmail.com) |
Abstract | Large-scale relationships like user-item preferences in a recommender system are mostly described by a high-dimensional and sparse (HiDS) matrix. A latent factor analysis (LFA) model extracts useful knowledge from an HiDS matrix efficiently, where stochastic gradient descent (SGD) is frequently adopted as the learning algorithm. However, a standard SGD algorithm updates a decision parameter with the stochastic gradient on the instant loss only, without considering information described by prior updates. Hence, an SGD-based LFA model commonly consumes many iterations to converge, which greatly affects its practicability. On the other hand, a proportional-integral-derivative (PID) controller makes a learning model converge fast with the consideration of its historical errors from the initial state till the current moment. Motivated by this discovery, this paper proposes a PID-incorporated SGD-based LFA (PSL) model. Its main idea is to rebuild the instant error on a single instance following the principle of PID, and then substitute this rebuilt error into an SGD algorithm for accelerating model convergence. Empirical studies on six widely-accepted HiDS matrices indicate that compared with state-of-the-art LFA models, a PSL model achieves significantly higher computational efficiency as well as highly competitive prediction accuracy for missing data of an HiDS matrix. (c) 2020 Elsevier B.V. All rights reserved. |
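The abstract outlines the core update rule: rebuild the instant error on each training instance following the PID principle (proportional, integral, and derivative terms over that instance's error history), then drive the standard regularized SGD update with the rebuilt error. The following is a minimal Python sketch of that idea, reconstructed from the abstract alone; the function name psl_train and all hyperparameter names (lr, lam, Kp, Ki, Kd) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a PID-incorporated SGD update for latent factor analysis,
# reconstructed from the abstract only; names and defaults are assumptions.
import numpy as np

def psl_train(ratings, num_users, num_items, rank=20, lr=0.01, lam=0.05,
              Kp=1.0, Ki=0.1, Kd=0.1, epochs=50):
    """Train latent factors P, Q on (user, item, value) triples of an HiDS matrix.

    The instant error on each instance is rebuilt following the PID principle,
    and the rebuilt error drives the standard regularized SGD update.
    """
    rng = np.random.default_rng(0)
    P = rng.standard_normal((num_users, rank)) * 0.1
    Q = rng.standard_normal((num_items, rank)) * 0.1
    integral = {}   # accumulated (integral) error per training instance
    prev_err = {}   # previous error per training instance (for the derivative term)

    for _ in range(epochs):
        for u, i, r in ratings:
            e = r - P[u] @ Q[i]                       # instant error on this instance
            integral[(u, i)] = integral.get((u, i), 0.0) + e
            d = e - prev_err.get((u, i), e)           # discrete derivative of the error
            prev_err[(u, i)] = e
            e_pid = Kp * e + Ki * integral[(u, i)] + Kd * d  # PID-rebuilt error

            # Standard regularized SGD step, driven by the rebuilt error
            grad_p = e_pid * Q[i] - lam * P[u]
            grad_q = e_pid * P[u] - lam * Q[i]
            P[u] += lr * grad_p
            Q[i] += lr * grad_q
    return P, Q
```

A toy call might look like psl_train([(0, 1, 4.0), (2, 3, 5.0)], num_users=3, num_items=4); note that with Kp=1 and Ki=Kd=0 the sketch reduces to a plain SGD-based LFA model, which is the baseline whose convergence the paper aims to accelerate.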
Funding | National Natural Science Foundation of China [61772493]; Guangdong Province Universities and Colleges Pearl River Scholar Funded Scheme (2019); Natural Science Foundation of Chongqing, China [cstc2019jcyjjqX0013] |
WOS Research Area | Computer Science |
Language | English |
WOS Accession Number | WOS:000611067800003 |
Publisher | ELSEVIER |
Source URL | http://119.78.100.138/handle/2HOD01W0/12799 |
Collection | Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences |
Author Affiliations | 1. China Patent Informat Ctr, Beijing 100088, Peoples R China; 2. Chinese Acad Sci, Chongqing Inst Green & Intelligent Technol, Chongqing 400714, Peoples R China; 3. Dongguan Univ Technol, Sch Comp Sci & Technol, Dongguan 523808, Guangdong, Peoples R China; 4. Chinese Acad Sci, Chongqing Engn Res Ctr Big Data Applicat Smart Ci, Chongqing 400714, Peoples R China; 5. Chinese Acad Sci, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing Inst Green & Intelligent Technol, Chongqing 400714, Peoples R China; 6. Univ Chinese Acad Sci, Beijing 100049, Peoples R China; 7. Beihang Univ, Sch Cyber Sci & Technol, Beijing 100191, Peoples R China |
Recommended Citation (GB/T 7714) | Li, Jinli, Yuan, Ye, Ruan, Tao, et al. A proportional-integral-derivative-incorporated stochastic gradient descent-based latent factor analysis model[J]. NEUROCOMPUTING, 2021, 427: 29-39.
APA | Li, Jinli, Yuan, Ye, Ruan, Tao, Chen, Jia, & Luo, Xin. (2021). A proportional-integral-derivative-incorporated stochastic gradient descent-based latent factor analysis model. NEUROCOMPUTING, 427, 29-39.
MLA | Li, Jinli, et al. "A proportional-integral-derivative-incorporated stochastic gradient descent-based latent factor analysis model." NEUROCOMPUTING 427 (2021): 29-39.
Deposit Method: OAI harvesting
Source: Chongqing Institute of Green and Intelligent Technology