A family of conjugate gradient methods with guaranteed positiveness and descent for vector optimization

Cited by: 1
Authors
He, Qing-Rui [1]
Li, Sheng-Jie [1,2]
Zhang, Bo-Ya [1]
Chen, Chun-Rong [1,2]
Affiliations
[1] Chongqing Univ, Coll Math & Stat, Chongqing 401331, Peoples R China
[2] Chongqing Univ, Key Lab Nonlinear Anal & Its Applicat, Minist Educ, Chongqing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Vector optimization; Conjugate gradient method; Sufficient descent condition; Global convergence; Line search algorithm; Constrained multiobjective optimization; Metric SSVM algorithms; Newton-like methods; Proximal methods; Convergence
DOI
10.1007/s10589-024-00609-0
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Subject classification codes
070105; 12; 1201; 1202; 120202
Abstract
In this paper, we propose a new way of modifying the conjugate parameter so that its positiveness is guaranteed and, building on the Dai-Yuan (DY) method in the vector setting, derive an associated family of conjugate gradient (CG) methods with guaranteed descent for solving unconstrained vector optimization problems. Several special members of the family are analyzed, and the (sufficient) descent condition, in the vector sense, is established for them. Under mild conditions, a general convergence result is presented for the CG methods with specific parameters; in particular, it covers the global convergence of the aforementioned members. Furthermore, for comparison, we consider the direct vector extensions of some Dai-Yuan-type methods, obtained by modifying the DY method of the scalar case. These vector extensions recover the classical parameters in the scalar minimization case, and their descent properties and global convergence are likewise studied under mild assumptions. Finally, numerical experiments illustrate the practical behavior of all proposed methods.
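For background, the scalar DY scheme that the paper generalizes picks each search direction as a combination of the new negative gradient and the previous direction, with a conjugate parameter that stays positive under a Wolfe line search. Below is a minimal Python sketch of this classical scalar Dai-Yuan CG iteration, not the vector-valued family proposed in the paper; the function name dy_cg, the use of SciPy's Wolfe line search, and the restart safeguard are illustrative assumptions.

import numpy as np
from scipy.optimize import line_search

def dy_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Classical scalar Dai-Yuan nonlinear conjugate gradient method.

    Iterates x_{k+1} = x_k + alpha_k d_k with d_{k+1} = -g_{k+1} + beta_k d_k,
    where beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)) is the DY parameter.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # A Wolfe line search guarantees d^T (g_new - g) > 0, so the DY
        # parameter is positive and the next direction is a descent direction.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                    # search failed: restart along -g
            d = -g                           # (illustrative safeguard)
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (d @ (g_new - g))  # DY parameter, > 0 under Wolfe
        d = -g_new + beta * d
        g = g_new
    return x

# Usage, e.g. on the Rosenbrock function:
# from scipy.optimize import rosen, rosen_der
# x_star = dy_cg(rosen, rosen_der, np.array([-1.2, 1.0]))  # converges to (1, 1)

Per the abstract, the paper's contribution is a vector-valued analogue of this scheme in which the conjugate parameter is modified so that positiveness and (sufficient) descent hold in the vector sense.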
Pages: 805-842 (38 pages)
References
61 in total
[21] El Moudden, M.; El Mouatasim, A. Accelerated diagonal steepest descent method for unconstrained multiobjective optimization. Journal of Optimization Theory and Applications, 2021, 188(1): 220-242.
[22] Fletcher, R. Practical Methods of Optimization. Wiley, 2013. DOI 10.1002/9781118723203.
[23] Fliege, J.; Svaiter, B.F. Steepest descent methods for multicriteria optimization. Mathematical Methods of Operations Research, 2000, 51(3): 479-494.
[24] Fliege, J.; Grana Drummond, L.M.; Svaiter, B.F. Newton's method for multiobjective optimization. SIAM Journal on Optimization, 2009, 20(2): 602-626.
[25] Fukuda, E.H.; Grana Drummond, L.M.; Masuda, A.M. A conjugate directions-type procedure for quadratic multiobjective optimization. Optimization, 2022, 71(2): 419-437.
[26] Fukuda, E.H.; Grana Drummond, L.M. Inexact projected gradient method for vector optimization. Computational Optimization and Applications, 2013, 54(3): 473-493.
[27] Gilbert, J.C.; Nocedal, J. Global convergence properties of conjugate gradient methods for optimization. SIAM Journal on Optimization, 1992, 2(1): 21-42.
[28] Goncalves, M.L.N.; Lima, F.S.; Prudente, L.F. Globally convergent Newton-type methods for multiobjective optimization. Computational Optimization and Applications, 2022, 83(2): 403-434.
[29] Goncalves, M.L.N.; Lima, F.S.; Prudente, L.F. A study of Liu-Storey conjugate gradient methods for vector optimization. Applied Mathematics and Computation, 2022, 425.
[30] Goncalves, M.L.N.; Prudente, L.F. On the extension of the Hager-Zhang conjugate gradient method for vector optimization. Computational Optimization and Applications, 2020, 76(3): 889-916.