Global Search and Analysis for the Nonconvex Two-Level l1 Penalty

Cited by: 1
|
Authors
He, Fan [1 ,2 ]
He, Mingzhen [1 ,2 ]
Shi, Lei [3 ,4 ]
Huang, Xiaolin [1 ,2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, MOE Key Lab Syst Control & Informat Proc, Inst Image Proc & Pattern Recognit, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Inst Med Robot, Shanghai 200240, Peoples R China
[3] Fudan Univ, Sch Math Sci, Shanghai Key Lab Contemporary Appl Math, Shanghai 200433, Peoples R China
[4] Shanghai Artificial Intelligence Lab, Shanghai 200232, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Compressive sensing; global search algorithm; kernel-based quantile regression; nonconvex optimization; two-level l1 penalty; NONCONCAVE PENALIZED LIKELIHOOD; SOFT MARGIN CLASSIFIERS; VARIABLE SELECTION; SPARSE; REGULARIZATION; RECOGNITION; REGRESSION; RECOVERY;
DOI
10.1109/TNNLS.2022.3201052
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Imposing suitably designed nonconvex regularization is effective for enhancing sparsity, but the corresponding global search algorithm has not been well established. In this article, we propose a global search algorithm for the nonconvex two-level l1 penalty based on its piecewise linear property and apply it to machine learning tasks. With this search capability, the optimization performance of the proposed algorithm is improved, resulting in better sparsity and accuracy than most state-of-the-art global and local algorithms. In addition, we provide an approximation analysis to demonstrate the effectiveness of our global search algorithm in sparse quantile regression.
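The piecewise-linear structure cited in the abstract is what makes a global search tractable: on each linear piece the subproblem is a convex quadratic, so only breakpoints and per-piece stationary points need to be examined. The paper's exact penalty definition and multivariate search procedure are not reproduced in this record; the Python sketch below is only an illustration under assumed parameters, using a generic two-slope penalty (slope 1 up to a hypothetical threshold tau, slope rho < 1 beyond it) and enumerating a finite candidate set to solve a one-dimensional proximal subproblem globally.

import numpy as np

def two_level_l1(u, tau=1.0, rho=0.3):
    """Hypothetical two-slope (piecewise-linear, nonconvex) penalty:
    slope 1 on [0, tau], slope rho < 1 beyond tau (illustrative only)."""
    a = np.abs(u)
    return np.where(a <= tau, a, tau + rho * (a - tau))

def prox_two_level_l1(v, lam=0.5, tau=1.0, rho=0.3):
    """Globally solve the 1-D proximal subproblem
        argmin_u 0.5 * (u - v)**2 + lam * two_level_l1(u).
    Because the penalty is piecewise linear, the minimizer is either a
    breakpoint (0, sign(v)*tau) or the stationary point of one linear
    piece, so enumerating this finite candidate set suffices."""
    candidates = {0.0, float(tau * np.sign(v))}
    for slope in (1.0, rho):
        # stationary point of the quadratic plus this linear piece
        u = v - lam * slope * np.sign(v)
        if np.sign(u) == np.sign(v) or u == 0.0:
            candidates.add(float(u))
    obj = lambda u: 0.5 * (u - v) ** 2 + lam * two_level_l1(u, tau, rho)
    return min(candidates, key=obj)

if __name__ == "__main__":
    # shrinkage behaves like soft-thresholding near zero and is flatter
    # (less biased) for large inputs
    print([round(prox_two_level_l1(v), 3) for v in np.linspace(-3, 3, 7)])

This toy covers only the scalar proximal step; the article's contribution is the global search for the full multivariate problem, which this sketch does not attempt to replicate.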
Pages: 3886-3899
Number of pages: 14
Related Articles
50 in total
  • [21] Group analysis of fMRI data using L1 and L2 regularization
    Overholser, Rosanna
    Xu, Ronghui
    STATISTICS AND ITS INTERFACE, 2015, 8 (03) : 379 - 390
  • [22] Spark-level sparsity and the l1 tail minimization
    Lai, Chun-Kit
    Li, Shidong
    Mondo, Daniel
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2018, 45 (01) : 206 - 215
  • [23] Detecting Gene-Gene Interactions Using Support Vector Machines with L1 Penalty
    Shen, Yuanyuan
    Liu, Zhe
    Ott, Jurg
    2010 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE WORKSHOPS (BIBMW), 2010, : 309 - 311
  • [24] L1 penalty and shrinkage estimation in partially linear models with random coefficient autoregressive errors
    Fallahpour, Saber
    Ahmed, S. Ejaz
    Doksum, Kjell A.
    APPLIED STOCHASTIC MODELS IN BUSINESS AND INDUSTRY, 2012, 28 (03) : 236 - 250
  • [25] An algorithm for l1 nearest neighbor search via monotonic embedding
    Wang, Xinan
    Dasgupta, Sanjoy
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [26] Nonconvex L1/2 Minimization Based Compressive Sensing Approach for Duct Azimuthal Mode Detection
    Bai, Baohong
    Li, Xiaodong
    Zhang, Tao
    Lin, Dakai
    AIAA JOURNAL, 2020, 58 (09) : 3932 - 3946
  • [27] Resolution Analysis of Imaging with l1 Optimization
    Borcea, Liliana
    Kocyigit, Ilker
    SIAM JOURNAL ON IMAGING SCIENCES, 2015, 8 (04): : 3015 - 3050
  • [28] l1 Regularization in Two-Layer Neural Networks
    Li, Gen
    Gu, Yuantao
    Ding, Jie
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 135 - 139
  • [29] The L1/2 regularization approach for survival analysis in the accelerated failure time model
    Chai, Hua
    Liang, Yong
    Liu, Xiao-Ying
    COMPUTERS IN BIOLOGY AND MEDICINE, 2015, 64 : 283 - 290
  • [30] Fast lithographic source pupil optimization using difference of convex functions algorithm for transformed L1 penalty
    Sun, Yiyu
    Li, Yanqiu
    Liao, Guanghui
    Yuan, Miao
    Liu, Yang
    Li, Yaning
    Zou, Lulu
    Liu, Lihui
    TWELFTH INTERNATIONAL CONFERENCE ON INFORMATION OPTICS AND PHOTONICS (CIOP 2021), 2021, 12057