An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization

Cited by: 0
Authors
Ruyu Liu
Shaohua Pan
Yuqia Wu
Xiaoqi Yang
Affiliations
[1] South China University of Technology,School of Mathematics
[2] The Hong Kong Polytechnic University,Department of Applied Mathematics
Keywords
Nonconvex and nonsmooth optimization; Regularized proximal Newton method; Global convergence; Convergence rate; KL function; Metric q-subregularity; 90C26; 49M15; 90C55
DOI
Not available
Abstract
This paper focuses on the minimization of the sum of a twice continuously differentiable function f and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed, based on an approximation to the Hessian of f that involves the ϱth power of the KKT residual. For ϱ = 0, we justify the global convergence of the iterate sequence for a KL objective function, and its R-linear convergence rate for a KL objective function of exponent 1/2. For ϱ ∈ (0, 1), by assuming that the cluster points satisfy, respectively, a local Hölderian error bound of order q on a second-order stationary point set and a local error bound of order q > 1 + ϱ on the common stationary point set, we establish the global convergence of the iterate sequence and its superlinear convergence rate, with order depending on q and ϱ. A dual semismooth Newton augmented Lagrangian method is also developed to seek an inexact minimizer of the subproblems. Numerical comparisons with two state-of-the-art methods on ℓ1-regularized Student's t-regressions, group penalized Student's t-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.
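To make the scheme concrete, the following is a minimal Python sketch of one regularized proximal Newton step for f + g with g taken as an ℓ1 norm. It only illustrates the idea stated in the abstract: the constant c, the proximal-gradient inner solver, and the iteration cap are illustrative assumptions, not the paper's choices (the paper solves the subproblem with a dual semismooth Newton augmented Lagrangian method).

```python
# A minimal sketch (not the authors' implementation) of one inexact
# regularized proximal Newton step for  min_x f(x) + g(x)  with
# g = lam*||.||_1.  Per the abstract, the Hessian of f is regularized
# using the varrho-th power of the KKT residual; the constant c, the
# inner solver, and the iteration cap are illustrative assumptions.
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t*||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def kkt_residual(x, grad, lam):
    # Natural residual r(x) = ||x - prox_{lam*||.||_1}(x - grad f(x))||;
    # it vanishes exactly at stationary points of f + lam*||.||_1.
    return np.linalg.norm(x - soft_threshold(x - grad, lam))

def reg_prox_newton_step(x, f_grad, f_hess, lam, varrho=0.5, c=1.0,
                         inner_iters=200):
    g = f_grad(x)
    r = kkt_residual(x, g, lam)
    # Regularized Hessian model: G = Hess f(x) + c * r^varrho * I.
    G = f_hess(x) + c * (r ** varrho) * np.eye(x.size)
    # Inexactly solve  min_y  g'(y-x) + 0.5*(y-x)'G(y-x) + lam*||y||_1
    # by proximal-gradient steps on the quadratic model.
    L = max(np.linalg.norm(G, 2), 1e-12)  # step size from spectral norm
    y = x.copy()
    for _ in range(inner_iters):
        y = soft_threshold(y - (g + G @ (y - x)) / L, lam / L)
    return y
```

For instance, with f(x) = ½‖Ax − b‖² one would pass f_grad = lambda x: A.T @ (A @ x - b) and f_hess = lambda x: A.T @ A, and repeat the step until kkt_residual falls below a tolerance. Note that for ϱ = 0 the regularization reduces to a constant shift c·I, matching the abstract's first regime.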
Pages: 603-641
Number of pages: 38
Related papers
50 records in total
  • [41] Constrained Nonconvex Nonsmooth Optimization via Proximal Bundle Method
    Yang, Yang
    Pang, Liping
    Ma, Xuefei
    Shen, Jie
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2014, 163 (03) : 900 - 925
  • [42] A Fast and Convergent Proximal Algorithm for Regularized Nonconvex and Nonsmooth Bi-level Optimization
    Chen, Ziyi
    Kailkhura, Bhavya
    Zhou, Yi
    arXiv preprint, 2022.
  • [43] Proximal gradient method for nonconvex and nonsmooth optimization on Hadamard manifolds
    Feng, Shuailing
    Huang, Wen
    Song, Lele
    Ying, Shihui
    Zeng, Tieyong
    OPTIMIZATION LETTERS, 2022, 16 (08) : 2277 - 2297
  • [45] On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
    Milzarek, Andre
    Xiao, Xiantao
    Wen, Zaiwen
    Ulbrich, Michael
    SCIENCE CHINA-MATHEMATICS, 2022, 65 (10) : 2151 - 2170
  • [48] Variable smoothing incremental aggregated gradient method for nonsmooth nonconvex regularized optimization
    Liu, Yuncheng
    Xia, Fuquan
    OPTIMIZATION LETTERS, 2021, 15 (06) : 2147 - 2164
  • [50] A proximal bundle method for a class of nonconvex nonsmooth composite optimization problems
    Pang, Liping
    Wang, Xiaoliang
    Meng, Fanyun
    JOURNAL OF GLOBAL OPTIMIZATION, 2023, 86 (03) : 589 - 620