Nonlinear conjugate gradient methods for unconstrained set optimization problems whose objective functions have finite cardinality

Cited: 2
Authors
Kumar, Krishan [1 ]
Ghosh, Debdas [1 ]
Yao, Jen-Chih [2 ,3 ]
Zhao, Xiaopeng [4 ]
Affiliations
[1] Indian Inst Technol BHU, Dept Math Sci, Varanasi, India
[2] China Med Univ, Ctr Gen Educ, Taichung, Taiwan
[3] Acad Romanian Scientists, Bucharest, Romania
[4] Tiangong Univ, Sch Math Sci, Tianjin, Peoples R China
Keywords
Set-valued optimization; conjugate gradient method; weakly minimal solutions; Wolfe line search; Zoutendijk-type condition; global convergence; order relations; descent method; scalarization; robustness
DOI
10.1080/02331934.2024.2390116
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Discipline Codes
070105; 12; 1201; 1202; 120202
Abstract
In this paper, we propose nonlinear conjugate gradient methods for unconstrained set optimization problems in which the objective function is given by a finite number of continuously differentiable vector-valued functions. First, we provide a general conjugate gradient algorithm that uses the Wolfe line search but imposes no explicit restriction on the conjugate parameter. We then study two variants of the algorithm, Fletcher-Reeves and conjugate descent, which employ two different choices of the conjugate parameter with the same line search rule. In the general algorithm, the direction of movement at each iterate is obtained by finding a descent direction of a vector optimization problem, which is constructed with the help of the concept of the partition set at the current iterate. As this vector optimization problem changes from iteration to iteration, the conventional conjugate gradient method for vector optimization cannot be directly extended to the set optimization problem under consideration. The well-definedness of the methods is established. Further, we prove a Zoutendijk-type condition, which assists in proving the global convergence of the methods even without a regularity condition on the stationary points. No convexity assumption is imposed on the objective function to prove convergence. Lastly, numerical examples are presented to illustrate the performance of the proposed methods, which we compare with the existing steepest descent method for set optimization. The proposed conjugate gradient methods are found to commonly outperform the steepest descent method.
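The abstract combines three classical ingredients: a search direction updated with a conjugate parameter, a Wolfe line search, and a Fletcher-Reeves (or conjugate descent) choice of that parameter. The sketch below illustrates only the classical scalar-valued Fletcher-Reeves iteration with a weak Wolfe line search from which these ingredients originate; it is not the paper's set-valued algorithm, which instead constructs a vector optimization subproblem from the partition set at each iterate. The helpers `wolfe_line_search` and `fletcher_reeves_cg`, and all parameter values, are illustrative assumptions.

```python
# Minimal sketch, assuming a smooth scalar objective and a weak Wolfe line
# search by bisection; the paper's set-valued method is NOT reproduced here.
import numpy as np


def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=50):
    """Return a step size t satisfying the (weak) Wolfe conditions along d."""
    fx, gx = f(x), grad(x)
    slope = gx @ d                      # directional derivative; negative for descent
    lo, hi, t = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * slope:      # Armijo condition fails: shrink
            hi = t
            t = 0.5 * (lo + hi)
        elif grad(x + t * d) @ d < c2 * slope:      # curvature condition fails: grow
            lo = t
            t = 2.0 * t if hi == np.inf else 0.5 * (lo + hi)
        else:
            return t
    return t


def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Classical FR nonlinear CG: d_{k+1} = -g_{k+1} + beta_k d_k,
    beta_k = ||g_{k+1}||^2 / ||g_k||^2, with a Wolfe step along d_k."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = wolfe_line_search(f, grad, x, d)
        x = x + t * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)            # Fletcher-Reeves parameter
        d = -g_new + beta * d
        if g_new @ d >= 0:                          # safeguard: restart if not a descent direction
            d = -g_new
        g = g_new
    return x


if __name__ == "__main__":
    # Usage example: minimize the Rosenbrock function from a standard start point.
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])
    print(fletcher_reeves_cg(f, grad, [-1.2, 1.0]))
```

The restart safeguard is a common practical choice: with a weak (rather than strong) Wolfe step, the Fletcher-Reeves update is not guaranteed to produce a descent direction, so the direction is reset to the negative gradient whenever that guarantee fails.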
Pages: 40