Maximal residual coordinate descent method with k-means clustering for solving large linear least-squares problems

Cited by: 0
Authors
Liu, Rui [1]
Yang, Ai-Li [1,2]
Ma, Jian [1,2]
Affiliations
[1] Hainan Normal Univ, Sch Math & Stat, Haikou 571158, Peoples R China
[2] Hainan Normal Univ, MOE Key Lab Data Sci & Intelligence Educ, Haikou 571158, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Linear least-squares problem; Coordinate descent method; k-means clustering; Maximal residual; Convergence analysis; RANDOMIZED KACZMARZ METHOD; ALGORITHM;
DOI
10.1007/s11075-025-02036-6
CLC classification number
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
For solving large-scale inconsistent linear systems with full column-rank coefficient matrices, we propose a maximal residual coordinate descent method integrated with k-means clustering, abbreviated as MRCD(k). The method begins by partitioning all equations (or, equivalently, hyperplanes) into k clusters via k-means clustering, with the cosine distance between the normal vectors of the hyperplanes as the similarity metric. In each iteration of MRCD(k), a cluster is selected uniformly at random, and the approximate solution is updated using the equation with the maximum residual within the selected cluster. To enhance computational efficiency, we further extend this framework to a block variant, termed block MRCD(k) or BMRCD(k), which updates the solution in every iteration by simultaneously using the maximum-residual equation from each of the k clusters. The convergence properties of both methods are analyzed in detail. Numerical results demonstrate the robustness and efficiency of the MRCD(k) and BMRCD(k) methods.
Pages: 17
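
The abstract describes the MRCD(k) iteration only at a high level, so the following Python sketch is one illustrative reading of it rather than the authors' algorithm: it treats the "equations" as the coordinates of the normal equations A^T A x = A^T b, approximates cosine-distance clustering by running Euclidean k-means on the normalized columns of A, and applies the standard maximal-residual coordinate-descent update x_j <- x_j + a_j^T(b - Ax)/||a_j||^2 known from the greedy coordinate descent literature. The function name mrcd_k_sketch, the stopping rule, and all parameter values are illustrative assumptions.

# A minimal, illustrative sketch of an MRCD(k)-style iteration for
# min_x ||Ax - b||_2, based only on the abstract. All concrete choices
# (Euclidean k-means on normalized columns as a stand-in for
# cosine-distance clustering, the coordinate-descent update, the
# stopping rule) are assumptions, not the authors' exact method.
import numpy as np
from sklearn.cluster import KMeans


def mrcd_k_sketch(A, b, k=4, max_iter=5000, tol=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    col_norms_sq = np.sum(A * A, axis=0)                  # ||a_j||_2^2 for each column

    # Cluster the coordinate directions: normalize each column and run
    # Euclidean k-means (assumed stand-in for cosine-distance clustering).
    directions = (A / np.sqrt(col_norms_sq)).T            # shape (n, m)
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(directions).labels_
    clusters = [np.flatnonzero(labels == c) for c in range(k)]

    x = np.zeros(n)
    s = A.T @ (b - A @ x)                                  # residual of the normal equations
    for _ in range(max_iter):
        if np.linalg.norm(s) <= tol * np.linalg.norm(A.T @ b):
            break
        cluster = clusters[rng.integers(k)]                # pick a cluster uniformly at random
        j = cluster[np.argmax(np.abs(s[cluster]))]         # max-residual coordinate in the cluster
        delta = s[j] / col_norms_sq[j]                     # coordinate-descent step length
        x[j] += delta
        s -= delta * (A.T @ A[:, j])                       # keep s = A^T (b - A x) up to date
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((500, 50))                     # full column rank (almost surely)
    x_star = rng.standard_normal(50)
    b = A @ x_star + 0.01 * rng.standard_normal(500)       # inconsistent right-hand side
    x = mrcd_k_sketch(A, b, k=4)
    print(np.linalg.norm(A.T @ (b - A @ x)))               # should be close to zero

Under the same assumptions, a block variant in the spirit of BMRCD(k) would, in each iteration, apply the maximal-residual update from every one of the k clusters simultaneously instead of sampling a single cluster.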