Nuclear Norm Regularized Least Squares Optimization on Grassmannian Manifolds

Cited: 0
Authors
Liu, Yuanyuan [1 ]
Cheng, Hong [1 ]
Shang, Fanhua [2 ]
Cheng, James [2 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Syst Engn & Engn Management, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
Source
UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2014
Keywords
MATRIX COMPLETION; ALGORITHMS;
DOI
Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper addresses a class of nuclear norm regularized least-squares (NNLS) problems. By exploiting the underlying low-rank matrix manifold structure, the nuclear norm regularized problem is cast as a Riemannian optimization problem over matrix manifolds. Compared with existing NNLS algorithms that require the singular value decomposition (SVD) of large-scale matrices, our method achieves a significant reduction in computational complexity. Moreover, our Grassmannian manifold method guarantees the uniqueness of the matrix factorization. In our solution, we first introduce a bilateral factorization into the original NNLS problem and convert it into a Grassmannian optimization problem via a linearization technique. We then develop a conjugate gradient procedure on the Grassmannian manifold with a guarantee of local convergence. Finally, our method extends to the graph regularized problem. Experimental results verify both the efficiency and effectiveness of our method.
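The SVD-avoiding factorization idea summarized in the abstract can be illustrated with a small numerical check. This is a hedged sketch, not the authors' implementation: it only verifies the standard variational characterization of the nuclear norm, ||X||_* = min over factorizations X = L R^T of (||L||_F^2 + ||R||_F^2)/2, which is what makes optimizing over small factors a substitute for taking singular values of the full matrix. All variable names here are illustrative.

```python
import numpy as np

# Hypothetical illustration (not the paper's code): a bilateral
# factorization X = L R^T gives the variational bound
#   ||X||_*  <=  (||L||_F^2 + ||R||_F^2) / 2,
# with equality at a "balanced" factorization built from the SVD.
rng = np.random.default_rng(0)
m, n, r = 50, 40, 5
L = rng.standard_normal((m, r))
R = rng.standard_normal((n, r))
X = L @ R.T                                   # rank-r matrix

nuc = np.linalg.norm(X, ord='nuc')            # nuclear norm (needs a full SVD)
bound = 0.5 * (np.linalg.norm(L, 'fro')**2 + np.linalg.norm(R, 'fro')**2)

# Balanced factors L_b = U sqrt(S), R_b = V sqrt(S) attain the bound exactly.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
L_b = U[:, :r] * np.sqrt(s[:r])
R_b = Vt[:r].T * np.sqrt(s[:r])
balanced = 0.5 * (np.linalg.norm(L_b, 'fro')**2 + np.linalg.norm(R_b, 'fro')**2)

print(f"||X||_* = {nuc:.4f}, generic bound = {bound:.4f}, balanced = {balanced:.4f}")
```

Minimizing over the thin factors L and R (here m*r + n*r numbers instead of m*n) is what lets manifold methods such as the paper's Grassmannian conjugate gradient sidestep repeated large-scale SVDs.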
Pages: 515-524 (10 pages)