An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization

Cited by: 0
Authors
Ruyu Liu
Shaohua Pan
Yuqia Wu
Xiaoqi Yang
Affiliations
[1] School of Mathematics, South China University of Technology
[2] Department of Applied Mathematics, The Hong Kong Polytechnic University
Keywords
Nonconvex and nonsmooth optimization; Regularized proximal Newton method; Global convergence; Convergence rate; KL function; Metric q-subregularity; 90C26; 49M15; 90C55
DOI: not available
Abstract
This paper focuses on the minimization of the sum of a twice continuously differentiable function f and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed, based on an approximation to the Hessian of f involving the $\varrho$th power of the KKT residual. For $\varrho=0$, we justify the global convergence of the iterate sequence for a KL objective function and its R-linear convergence rate for a KL objective function of exponent 1/2. For $\varrho\in(0,1)$, by assuming that the cluster points satisfy a locally Hölderian error bound of order q on a second-order stationary point set and a local error bound of order $q>1+\varrho$ on the common stationary point set, respectively, we establish the global convergence of the iterate sequence and its superlinear convergence rate with order depending on q and $\varrho$. A dual semismooth Newton augmented Lagrangian method is also developed for seeking an inexact minimizer of the subproblems. Numerical comparisons with two state-of-the-art methods on $\ell_1$-regularized Student's t-regressions, group penalized Student's t-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.
Pages: 603–641 (38 pages)
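To make the iteration described in the abstract concrete, the following Python sketch works through one plausible reading of it on an $\ell_1$-regularized Student's t-regression: the Hessian of f is damped by a multiple of the $\varrho$th power of the KKT residual, the resulting proximal Newton subproblem is solved only inexactly (here by plain proximal-gradient steps rather than the dual semismooth Newton augmented Lagrangian method of the paper), and a backtracking line search is applied. All names and constants (b1, b2, varrho, the inner tolerance, the Armijo parameters) are illustrative assumptions, not the quantities analyzed in the paper.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal mapping of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def student_t_loss(A, b, nu=1.0):
    """Smooth nonconvex Student's t loss f(x) = sum_i log(1 + (a_i^T x - b_i)^2 / nu)."""
    def f(x):
        r = A @ x - b
        return np.sum(np.log1p(r ** 2 / nu))
    def grad(x):
        r = A @ x - b
        return A.T @ (2.0 * r / (nu + r ** 2))
    def hess(x):
        r = A @ x - b
        w = 2.0 * (nu - r ** 2) / (nu + r ** 2) ** 2
        return A.T @ (w[:, None] * A)
    return f, grad, hess

def reg_prox_newton(f, grad, hess, lam, x0, varrho=0.5, b1=1.0, b2=1e-8,
                    max_iter=100, tol=1e-6, max_inner=200):
    """Sketch of a regularized proximal Newton loop for min f(x) + lam*||x||_1.
    The subproblem is solved inexactly by proximal-gradient steps; the paper
    instead uses a dual semismooth Newton augmented Lagrangian method."""
    x = x0.copy()
    n = x.size
    for _ in range(max_iter):
        g = grad(x)
        # KKT residual of the composite problem: r(x) = x - prox_{lam*||.||_1}(x - grad f(x))
        res = np.linalg.norm(x - prox_l1(x - g, lam))
        if res <= tol:
            break
        # Hessian damped by a multiple of the varrho-th power of the KKT residual
        G = hess(x) + (b1 * res ** varrho + b2) * np.eye(n)
        # Inexactly solve: min_d  g^T d + 0.5 d^T G d + lam*||x + d||_1
        L = np.linalg.norm(G, 2)        # step size 1/L for the inner proximal-gradient loop
        y = x.copy()
        for _ in range(max_inner):
            y_new = prox_l1(y - (g + G @ (y - x)) / L, lam / L)
            if np.linalg.norm(y_new - y) <= 0.1 * res:   # loose inner tolerance
                y = y_new
                break
            y = y_new
        d = y - x
        # Armijo backtracking on F = f + lam*||.||_1 along the proximal Newton direction
        F = lambda z: f(z) + lam * np.linalg.norm(z, 1)
        Fx = F(x)
        Delta = g @ d + lam * (np.linalg.norm(x + d, 1) - np.linalg.norm(x, 1))
        t = 1.0
        while F(x + t * d) > Fx + 1e-4 * t * Delta and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

if __name__ == "__main__":
    # Small synthetic l1-regularized Student's t-regression instance.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((60, 100))
    x_true = np.zeros(100); x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.05 * rng.standard_normal(60)
    f, grad, hess = student_t_loss(A, b, nu=1.0)
    x_hat = reg_prox_newton(f, grad, hess, lam=0.5, x0=np.zeros(100))
    print("nonzeros in solution:", int(np.sum(np.abs(x_hat) > 1e-6)))
```

Note that for varrho in (0,1) the damping term vanishes as the KKT residual goes to zero, which is consistent with the superlinear-rate statement in the abstract, whereas varrho = 0 keeps a constant regularization.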