An adaptive competitive penalty method for nonsmooth constrained optimization

Cited by: 0
Authors
N. Mahdavi-Amiri
M. Shaeiri
Affiliations
[1] Sharif University of Technology, Faculty of Mathematical Sciences
Source
Numerical Algorithms | 2017 / Volume 75
Keywords
Nonsmooth optimization; Conjugate gradient; Constrained problem; Goldstein subdifferential
DOI
Not available
Abstract
We present a competitive algorithm to minimize a locally Lipschitz function subject to locally Lipschitz constraints. The approach is to use an ℓ1 nonsmooth penalty function. The method generates second-order descent directions to minimize the ℓ1 penalty function. We introduce a new criterion for deciding upon the acceptability of a Goldstein subdifferential approximation, and we show that the new criterion improves the Goldstein subdifferential approximation introduced by Mahdavi-Amiri and Yousefpour. Moreover, owing to our proposed line search strategy, the method always moves through differentiable points. Furthermore, the method is adaptive in the sense that, when the iterates move through sufficiently smooth regions, the search directions switch exactly to Shanno's conjugate gradient directions and no subdifferential approximation is computed. Global convergence of the algorithm is established. Finally, an extensive comparison of the results obtained by our proposed algorithm with those obtained by several well-known methods shows significant reductions in the number of function and gradient evaluations.
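For context, a minimal sketch of the two central objects mentioned in the abstract, stated under standard assumptions (the exact formulations used by the authors may differ). For a constrained problem of the form

  minimize f(x)  subject to  g_i(x) ≤ 0,  i = 1, …, m,

with f and the g_i locally Lipschitz, the ℓ1 (exact) penalty function with parameter σ > 0 is commonly written as

  P_σ(x) = f(x) + σ ∑_{i=1}^{m} max{0, g_i(x)},

which is again locally Lipschitz. The Goldstein ε-subdifferential of a locally Lipschitz function f at a point x is

  ∂_ε f(x) = cl conv { ∂f(y) : ‖y − x‖ ≤ ε },

where ∂f denotes the Clarke subdifferential; as described above, the algorithm works with finite approximations of this set and tests their acceptability.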
Pages: 305-336
Number of pages: 31