Nuclear Norm Regularized Least Squares Optimization on Grassmannian Manifolds

Cited: 0
Authors
Liu, Yuanyuan [1 ]
Cheng, Hong [1 ]
Shang, Fanhua [2 ]
Cheng, James [2 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Syst Engn & Engn Management, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
Source
UNCERTAINTY IN ARTIFICIAL INTELLIGENCE | 2014
Keywords
MATRIX COMPLETION; ALGORITHMS;
DOI
Not available
CLC classification
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper aims to address a class of nuclear norm regularized least squares (NNLS) problems. By exploiting the underlying low-rank matrix manifold structure, the nuclear norm regularized problem is cast as a Riemannian optimization problem over matrix manifolds. Compared with existing NNLS algorithms that involve the singular value decomposition (SVD) of large-scale matrices, our method achieves a significant reduction in computational complexity. Moreover, our Grassmannian manifold method guarantees the uniqueness of the matrix factorization. In our solution, we first introduce a bilateral factorization into the original NNLS problem and convert it into a Grassmannian optimization problem via a linearization technique. We then develop a conjugate gradient procedure on the Grassmannian manifold with a guarantee of local convergence. Finally, our method can be extended to the graph regularized problem. Experimental results verify both the efficiency and effectiveness of our method.
Pages: 515-524 (10 pages)
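To illustrate the low-rank factorization idea the abstract describes, here is a minimal sketch of nuclear-norm-regularized least squares for matrix completion via a bilateral factorization X = L Rᵀ, using the well-known variational identity ||X||₊ = min over X = L Rᵀ of (||L||²_F + ||R||²_F)/2, which avoids any SVD of the full matrix. This is plain gradient descent, not the paper's Grassmannian conjugate gradient algorithm; all dimensions and hyperparameters below are invented for the demo.

```python
import numpy as np

# Sketch only: solves  min  0.5 * ||P_Omega(L R^T - X)||_F^2
#                          + 0.5 * lam * (||L||_F^2 + ||R||_F^2)
# where P_Omega keeps the observed entries. By the variational form of the
# nuclear norm, the regularizer upper-bounds lam * ||L R^T||_*.

rng = np.random.default_rng(0)
m, n, r = 30, 20, 3                        # sizes and rank (illustrative)
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.6            # ~60% of entries observed
lam, step, iters = 0.1, 0.01, 2000         # regularization, step size, sweeps

L = 0.3 * rng.standard_normal((m, r))      # small random factor init
R = 0.3 * rng.standard_normal((n, r))

for _ in range(iters):
    resid = mask * (L @ R.T - X_true)      # residual on observed entries only
    gL = resid @ R + lam * L               # gradient w.r.t. L
    gR = resid.T @ L + lam * R             # gradient w.r.t. R
    L -= step * gL
    R -= step * gR

rel_err = np.linalg.norm(L @ R.T - X_true) / np.linalg.norm(X_true)
print(f"relative reconstruction error: {rel_err:.3f}")
```

The factorization keeps all iterates at rank r, so each sweep costs O(mnr) instead of the O(mn·min(m, n)) of a full SVD, which is the complexity saving the abstract refers to.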