Nonlinear conjugate gradient methods for unconstrained set optimization problems whose objective functions have finite cardinality

Cited: 2
Authors
Kumar, Krishan [1 ]
Ghosh, Debdas [1 ]
Yao, Jen-Chih [2 ,3 ]
Zhao, Xiaopeng [4 ]
Affiliations
[1] Indian Inst Technol BHU, Dept Math Sci, Varanasi, India
[2] China Med Univ, Ctr Gen Educ, Taichung, Taiwan
[3] Acad Romanian Scientists, Bucharest, Romania
[4] Tiangong Univ, Sch Math Sci, Tianjin, Peoples R China
Keywords
Set-valued optimization; conjugate gradient method; weakly minimal solutions; Wolfe line search; Zoutendijk-type condition; GLOBAL CONVERGENCE; ORDER RELATIONS; DESCENT METHOD; SCALARIZATION; ROBUSTNESS;
DOI
10.1080/02331934.2024.2390116
Chinese Library Classification
C93 [Management]; O22 [Operations Research];
Discipline Codes
070105; 12; 1201; 1202; 120202;
Abstract
In this paper, we propose nonlinear conjugate gradient methods for unconstrained set optimization problems in which the objective function is given by a finite number of continuously differentiable vector-valued functions. First, we present a general conjugate gradient algorithm that uses a Wolfe line search but does not impose an explicit restriction on the conjugate parameter. We then study two variants of the algorithm, Fletcher-Reeves and conjugate descent, corresponding to two different choices of the conjugate parameter under the same line search rule. In the general algorithm, the direction of movement at each iterate is identified by finding a descent direction of a vector optimization problem, which is constructed with the help of the concept of the partition set at the current iterate. Because this vector optimization problem changes from iteration to iteration, the conventional conjugate gradient method for vector optimization cannot be directly extended to the set optimization problem under consideration. We establish the well-definedness of the methods and prove a Zoutendijk-type condition, which assists in proving global convergence even without a regularity condition on the stationary points. No convexity assumption is imposed on the objective function to prove convergence. Lastly, numerical examples are presented to illustrate the performance of the proposed methods and to compare them with the existing steepest descent method for set optimization. The proposed conjugate gradient methods are commonly found to outperform the steepest descent method.
Pages: 40
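The abstract builds on the classical nonlinear conjugate gradient template: a conjugate direction update combined with a Wolfe line search, with Fletcher-Reeves as one choice of the conjugate parameter. As a point of reference only, the following is a minimal Python sketch of that classical scheme for an ordinary smooth scalar objective. It is not the authors' set-optimization method, which constructs the search direction from a vector optimization subproblem defined by the partition set at the current iterate; the function name fletcher_reeves_cg, the restart rule, and all parameter values below are illustrative assumptions.

```python
# Minimal sketch of a Fletcher-Reeves nonlinear conjugate gradient iteration
# with a (strong) Wolfe line search, for a smooth scalar objective.
# Illustrative only; not the set-optimization algorithm of the paper.
import numpy as np
from scipy.optimize import line_search  # SciPy's strong Wolfe line search

def fletcher_reeves_cg(f, grad, x0, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:              # stationarity test
            break
        # Step size satisfying the Wolfe conditions (c2 < 1/2 suits FR theory)
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.4)[0]
        if alpha is None:                         # line search failed: restart
            d, alpha = -g, 1e-4                   # (ad hoc fallback for this sketch)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves conjugate parameter
        d = -g_new + beta_fr * d                  # new conjugate direction
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Example: minimize the Rosenbrock function from a standard starting point.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(fletcher_reeves_cg(f, grad, np.array([-1.2, 1.0])))
```

In the paper's setting, the scalar objective and gradient above would be replaced by the vector optimization subproblem built from the partition set at the current iterate, so the descent-direction computation differs at every iteration.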