A Nonconvex Proximal Splitting Algorithm under Moreau-Yosida Regularization
Cited: 0
Authors:
Laude, Emanuel [1]; Wu, Tao [1]; Cremers, Daniel [1]
Affiliations:
[1] Tech Univ Munich, Dept Informat, Munich, Germany
Source:
INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84 | 2018 / Vol. 84
Keywords:
CONVERGENCE; NONSMOOTH
DOI:
Not available
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory]
Subject Classification Codes:
081104; 0812; 0835; 1405
Abstract:
We tackle highly nonconvex, nonsmooth composite optimization problems whose objectives comprise a Moreau-Yosida regularized term. Classical nonconvex proximal splitting algorithms, such as nonconvex ADMM, can fail to converge on this problem class. To overcome this difficulty, we consider a lifted variant of the Moreau-Yosida regularized model and propose a novel multiblock primal-dual algorithm that intrinsically stabilizes the dual block. We provide a complete convergence analysis of our algorithm and identify optimality qualifications under which stationarity of the original model is recovered at convergence. Numerically, we demonstrate the relevance of Moreau-Yosida regularized models and the efficiency of our algorithm on robust regression as well as on joint feature selection and semi-supervised learning.
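For reference, a minimal sketch of the Moreau-Yosida regularization (Moreau envelope) mentioned in the abstract, assuming the standard definition with smoothing parameter \mu > 0; the symbols e_{\mu} g, \operatorname{prox}_{\mu g}, f, g, and A are illustrative placeholders rather than the paper's own notation:

\[
  e_{\mu} g(x) \;=\; \min_{z} \Big\{ g(z) + \tfrac{1}{2\mu}\,\|z - x\|^{2} \Big\},
  \qquad
  \operatorname{prox}_{\mu g}(x) \;=\; \operatorname*{arg\,min}_{z} \Big\{ g(z) + \tfrac{1}{2\mu}\,\|z - x\|^{2} \Big\}.
\]

A composite objective comprising such a regularized term might then take the assumed form
\[
  \min_{x} \; f(x) + e_{\mu} g(Ax),
\]
where f is a (possibly nonconvex) loss and g a nonsmooth function; for convex g the envelope e_{\mu} g is continuously differentiable with a 1/\mu-Lipschitz gradient, which is the usual motivation for Moreau-Yosida regularization.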
Pages: 9