A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization

Cited by: 1
Authors
He, Qing-Rui [1 ]
Li, Sheng-Jie [1 ,2 ]
Zhang, Bo-Ya [1 ]
Chen, Chun-Rong [1 ,2 ]
Affiliations
[1] Chongqing Univ, Coll Math & Stat, Chongqing 401331, Peoples R China
[2] Chongqing Univ, Key Lab Nonlinear Anal & Its Applicat, Minist Educ, Chongqing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Vector optimization; Conjugate gradient method; Sufficient descent condition; Global convergence; Line search algorithm; CONSTRAINED MULTIOBJECTIVE OPTIMIZATION; METRIC SSVM ALGORITHMS; NEWTON-LIKE METHODS; PROXIMAL METHODS; CONVERGENCE;
DOI
10.1007/s10589-024-00609-0
CLC Classification
C93 [Management]; O22 [Operations Research]
Subject Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
In this paper, we propose a new way of modifying the conjugate parameter to guarantee its positiveness and, building on the Dai-Yuan (DY) method in the vector setting, introduce an associated family of conjugate gradient (CG) methods with guaranteed descent for solving unconstrained vector optimization problems. Several special members of the family are analyzed, and the (sufficient) descent condition, in the vector sense, is established for them. Under mild conditions, a general convergence result is presented for the CG methods with specific parameters; in particular, it covers the global convergence of the aforementioned members. Furthermore, for comparison, we consider direct extensions of some Dai-Yuan-type methods obtained by modifying the scalar DY method. These vector extensions recover the classical parameters in the scalar minimization case, and their descent properties and global convergence are likewise studied under mild assumptions. Finally, numerical experiments illustrate the practical behavior of all proposed methods.
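For context, the classical scalar Dai-Yuan scheme that the paper generalizes chooses the conjugate parameter as beta = ||g_new||^2 / d^T(g_new - g). The sketch below implements this scalar iteration in Python, with a simple Armijo backtracking search standing in for the Wolfe line search assumed in the convergence theory. It is an illustrative sketch of the classical scalar method only, not the paper's vector-valued family; the names f, grad_f, and dai_yuan_cg are hypothetical.

```python
import numpy as np

def dai_yuan_cg(f, grad_f, x0, tol=1e-6, max_iter=500):
    """Sketch of the classical (scalar) Dai-Yuan CG method.

    Conjugate parameter: beta = ||g_new||^2 / (d^T (g_new - g)).
    An Armijo backtracking search replaces the Wolfe line search
    used in the original convergence analysis.
    """
    x = np.asarray(x0, dtype=float)
    g = grad_f(x)
    d = -g                        # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gd = g @ d
        if gd >= 0:               # safeguard: restart if not a descent direction
            d, gd = -g, -(g @ g)
        # Armijo backtracking line search along d.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * gd and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad_f(x_new)
        denom = d @ (g_new - g)
        # Dai-Yuan parameter; guard against a vanishing denominator.
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = dai_yuan_cg(lambda x: 0.5 * x @ A @ x - b @ x,
                     lambda x: A @ x - b,
                     x0=np.zeros(2))
```

Note that when the denominator d^T(g_new - g) stays positive, beta is positive; the paper's contribution concerns how to preserve this positiveness, together with a descent property, in the vector-valued setting.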
Pages: 805-842
Number of pages: 38