A subspace inertial method for derivative-free nonlinear monotone equations

Cited by: 14
Authors
Kimiaei, Morteza [1 ]
Ibrahim, Abdulkarim Hassan [2 ]
Ghaderi, Susan [3 ]
Affiliations
[1] Univ Wien, Fak Math, Vienna, Austria
[2] King Fahd Univ Petr & Minerals, Interdisciplinary Res Ctr IRC Smart Mobil & Logist, Dhahran, Saudi Arabia
[3] Leuven AI KU Leuven Inst AI, Dept Elect Engn ESAT STADIUS, Leuven, Belgium
Keywords
Monotone equations; derivative-free optimization; inertial technique; global convergence; complexity result; conjugate-gradient method; worst-case complexity; adaptive radius; superlinear convergence; algorithm; systems; descent; operators
DOI
10.1080/02331934.2023.2252849
Chinese Library Classification (CLC)
C93 [Management]; O22 [Operations Research]
Discipline classification codes
070105; 12; 1201; 1202; 120202
Abstract
We introduce a subspace inertial line search algorithm (SILSA) for finding solutions of nonlinear monotone equations (NME). At each iteration, a new point is generated in a subspace spanned by the previous points. Among the finitely many points forming this subspace, the one with the largest residual norm is replaced by the new point to update the subspace. In this way, SILSA leaves regions far from a solution of the NME and moves toward regions near it, leading to fast convergence. This study establishes global convergence of SILSA and upper bounds on the number of iterations and function evaluations it requires. Numerical results show that SILSA is promising compared with a basic line search algorithm using several known derivative-free directions.
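A minimal sketch of the subspace-update idea described in the abstract, assuming a residual map F, a fixed number m of stored points, and a placeholder step in place of the paper's inertial line search; the function names and the trial-point rule below are illustrative assumptions, not the authors' implementation.

import numpy as np

def F(x):
    # Illustrative monotone residual map: F(x) = x + sin(x) componentwise (zero at x = 0).
    return x + np.sin(x)

def subspace_update(points, norms, x_new):
    # Replace the stored point with the largest residual norm by the new point.
    worst = int(np.argmax(norms))
    points[worst] = x_new
    norms[worst] = np.linalg.norm(F(x_new))
    return points, norms

def solve(x0, m=5, max_iter=100, tol=1e-8):
    # Maintain m points; form a trial point inside the subspace they span
    # (a residual-weighted combination stands in for the inertial line-search step),
    # then replace the worst stored point by the trial point.
    points = [x0 + 0.1 * np.random.randn(*x0.shape) for _ in range(m)]
    norms = [np.linalg.norm(F(p)) for p in points]
    for _ in range(max_iter):
        w = 1.0 / (np.array(norms) + 1e-16)      # favour points with small residuals
        x_new = sum(wi * p for wi, p in zip(w / w.sum(), points))
        x_new = x_new - 0.5 * F(x_new)           # crude damped residual step (placeholder)
        points, norms = subspace_update(points, norms, x_new)
        if min(norms) <= tol:
            break
    return points[int(np.argmin(norms))]

print(solve(np.ones(3)))

The driver is only meant to show how discarding the worst-residual point keeps the stored subspace concentrated near a solution; the actual SILSA search direction and line search are given in the paper.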
Pages: 269-296
Page count: 28