Large-Scale Affine Matrix Rank Minimization With a Novel Nonconvex Regularizer
Document Type: Journal Article
Authors | Wang, Zhi (1); Liu, Yu (2); Luo, Xin (3,4) |
Journal | IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS
Publication Date | 2021-02-27 |
Pages | 15 |
Keywords | Minimization; Convergence; Tensors; Optimization; Analytical models; Data models; Data analysis; Inexact proximal step; low-rank minimization; matrix completion; novel nonconvex regularizer; robust principal component analysis (RPCA); tensor completion |
ISSN | 2162-237X |
DOI | 10.1109/TNNLS.2021.3059711 |
Corresponding Authors | Wang, Jianjun (jjw@swu.edu.cn); Chen, Wu (chenwu@swu.edu.cn) |
Abstract | Low-rank minimization aims to recover a matrix of minimum rank subject to linear system constraints. It arises in various data analysis and machine learning areas, such as recommender systems, video denoising, and signal processing. Nuclear norm minimization is the dominant approach to handling it. However, this method ignores the differences among the singular values of the target matrix. To address this issue, nonconvex low-rank regularizers have been widely used. Unfortunately, existing methods suffer from various drawbacks, such as inefficiency and inaccuracy. To alleviate these problems, this article proposes a flexible model with a novel nonconvex regularizer. Such a model not only promotes low-rankness but also can be solved much faster and more accurately. With it, the original low-rank problem can be equivalently transformed into the resulting optimization problem under the rank restricted isometry property (rank-RIP) condition. Subsequently, Nesterov's rule and inexact proximal strategies are adopted to derive a novel algorithm that is highly efficient in solving this problem at a convergence rate of O(1/K), where K is the iteration count. In addition, the asymptotic convergence rate is analyzed rigorously by means of the Kurdyka-Lojasiewicz (KL) inequality. Furthermore, we apply the proposed optimization model to typical low-rank problems, including matrix completion, robust principal component analysis (RPCA), and tensor completion. Exhaustive empirical studies on data analysis tasks, i.e., synthetic data analysis, image recovery, personalized recommendation, and background subtraction, indicate that the proposed model outperforms state-of-the-art models in both accuracy and efficiency. |
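The record above is purely bibliographic, but the abstract outlines an algorithmic recipe: a Nesterov-accelerated, inexact proximal scheme for a low-rank model with a nonconvex singular-value penalty. The minimal Python/NumPy sketch below illustrates that general recipe for matrix completion only; the specific regularizer, the inexact proximal rule, and every name and parameter here (`svt_nonconvex`, `gamma`, `lam`, `step`) are illustrative assumptions, not the authors' published method.

```python
import numpy as np

def svt_nonconvex(Y, lam, gamma=5.0):
    """Proximal-style step on singular values for a nonconvex low-rank penalty.

    NOTE: the paper's actual regularizer is not given in this record; a
    weighted soft-threshold (less shrinkage on large singular values) is
    used here purely as a stand-in for a nonconvex surrogate of the rank.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    weights = lam / (1.0 + gamma * s)          # large singular values shrunk less
    s_shrunk = np.maximum(s - weights, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

def accelerated_pg_completion(M, mask, lam=1.0, step=1.0, iters=200):
    """Nesterov-accelerated proximal gradient for matrix completion.

    Minimizes 0.5 * ||P_Omega(X - M)||_F^2 + nonconvex low-rank penalty(X),
    where P_Omega keeps only the observed entries (mask == True).
    """
    X = np.zeros_like(M)
    Z = X.copy()
    t = 1.0
    for _ in range(iters):
        grad = mask * (Z - M)                  # gradient of the data-fit term
        X_new = svt_nonconvex(Z - step * grad, lam * step)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        Z = X_new + ((t - 1.0) / t_new) * (X_new - X)  # Nesterov extrapolation
        X, t = X_new, t_new
    return X

# Toy usage: recover a rank-2 matrix from roughly half of its entries.
rng = np.random.default_rng(0)
M_true = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 100))
mask = rng.random(M_true.shape) < 0.5
X_hat = accelerated_pg_completion(mask * M_true, mask, lam=0.5)
print("relative error:", np.linalg.norm(X_hat - M_true) / np.linalg.norm(M_true))
```

The toy run only demonstrates the structure of such a solver (gradient step on observed entries, singular-value shrinkage, extrapolation); the O(1/K) rate and KL-based asymptotic analysis claimed in the abstract apply to the authors' model and algorithm, not to this sketch.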
Funding | National Natural Science Foundation of China[61673015] ; National Natural Science Foundation of China[61976181] ; National Natural Science Foundation of China[11931015] ; National Natural Science Foundation of China[61971296] ; National Natural Science Foundation of China[U19A2078] ; Fundamental Research Funds for the Central Universities[XDJK2019B063] ; Fundamental Research Funds for the Central Universities[SWU120036] ; Natural Science Foundation of Chongqing (China)[cstc2019jcyjjqX0013] ; Chongqing Research Program of Technology Innovation and Application[cstc2019jscx-fxydX0024] ; Chongqing Research Program of Technology Innovation and Application[cstc2019jscx-fxydX0027] ; Chongqing Research Program of Technology Innovation and Application[cstc2018jszx-cyzdX0041] ; CAAI-Huawei MindSpore Open Fund[CAAIXSJLJJ-2020-004B] ; Pioneer Hundred Talents Program of Chinese Academy of Sciences |
WOS Research Areas | Computer Science; Engineering |
Language | English |
WOS Accession Number | WOS:000733524900001 |
Publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC |
Source URL | http://119.78.100.138/handle/2HOD01W0/14743 |
Collection | Chongqing Institute of Green and Intelligent Technology, Chinese Academy of Sciences |
Author Affiliations | 1.Southwest Univ, Coll Comp & Informat Sci, Chongqing 400715, Peoples R China 2.Univ Adelaide, Sch Comp Sci, Adelaide, SA 5005, Australia 3.Chinese Acad Sci, Chongqing Engn Res Ctr Big Data Applicat Smart Ci, Chongqing 400714, Peoples R China 4.Chinese Acad Sci, Chongqing Inst Green & Intelligent Technol, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing 400714, Peoples R China 5.Southwest Univ, Sch Math & Stat, Chongqing 400715, Peoples R China 6.Sichuan Univ, Coll Comp Sci, Chengdu 610065, Peoples R China |
Recommended Citation (GB/T 7714) | Wang, Zhi, Liu, Yu, Luo, Xin, et al. Large-Scale Affine Matrix Rank Minimization With a Novel Nonconvex Regularizer[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021: 15. |
APA | Wang, Zhi., Liu, Yu., Luo, Xin., Wang, Jianjun., Gao, Chao., ... & Chen, Wu. (2021). Large-Scale Affine Matrix Rank Minimization With a Novel Nonconvex Regularizer. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 15. |
MLA | Wang, Zhi, et al. "Large-Scale Affine Matrix Rank Minimization With a Novel Nonconvex Regularizer." IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS (2021): 15. |
Ingestion Method: OAI Harvesting
Source: Chongqing Institute of Green and Intelligent Technology