Sparse learning with concave regularization: relaxation of the irrepresentable condition

Cited by: 0
Authors
Cerone, V [1 ]
Fosson, S. M. [1 ]
Regruto, D. [1 ]
Salam, A. [1 ]
Affiliations
[1] Politecn Torino, Dipartimento Automat & Informat, Corso Duca Abruzzi 24, I-10129 Turin, Italy
Source
2020 59TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2020
Keywords
VARIABLE SELECTION
DOI
Not available
Chinese Library Classification (CLC)
TP [automation technology; computer technology]
Discipline code
0812
Abstract
Learning sparse models from data is an important task in all those frameworks where relevant information must be identified within a large dataset. This can be achieved by formulating and solving suitable sparsity-promoting optimization problems. For linear regression models, Lasso is the most popular convex approach, based on an ℓ1-norm regularization. In contrast, in this paper we analyse a concave regularized approach, and we prove that it relaxes the irrepresentable condition, which is sufficient and essentially necessary for Lasso to select the right significant parameters. In practice, this has the benefit of reducing the number of measurements required with respect to Lasso. Since the proposed problem is non-convex, we also discuss different algorithms to solve it, and we illustrate the obtained enhancement via numerical experiments.
Pages: 396-401 (6 pages)
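The abstract contrasts the convex ℓ1 (Lasso) regularization with a concave regularizer. A minimal numerical sketch of the two approaches, assuming a log-sum concave penalty handled by iterative reweighted ℓ1 (in the style of Candès and Wakin) with a plain ISTA solver for each weighted Lasso subproblem; this is an illustration under these assumptions, not the authors' algorithm or analysis:

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*||.||_1 (elementwise soft-thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista_weighted_lasso(A, y, lam, w, steps=2000):
    # ISTA for  min_x  0.5*||Ax - y||^2 + lam * sum_i w_i*|x_i|.
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x = soft_threshold(x - A.T @ (A @ x - y) / L, lam * w / L)
    return x

def lasso(A, y, lam, steps=2000):
    # Plain Lasso: unit weights on every coefficient.
    return ista_weighted_lasso(A, y, lam, np.ones(A.shape[1]), steps)

def reweighted_l1(A, y, lam, eps=1e-3, outer=8, steps=2000):
    # Concave (log-sum) regularization via iterative reweighting: each
    # outer round solves a weighted Lasso with w_i = 1/(|x_i| + eps),
    # so large coefficients are penalized less than in plain Lasso.
    w = np.ones(A.shape[1])
    for _ in range(outer):
        x = ista_weighted_lasso(A, y, lam, w, steps)
        w = 1.0 / (np.abs(x) + eps)
    return x

# Noiseless sparse-recovery example; all sizes are illustrative choices.
rng = np.random.default_rng(0)
m, n, k = 25, 50, 3
A = rng.normal(size=(m, n)) / np.sqrt(m)    # normalized Gaussian sensing matrix
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.choice([-1.0, 1.0], k) * rng.uniform(1.0, 2.0, k)
y = A @ x_true

x_lasso = lasso(A, y, lam=0.01)
x_conc = reweighted_l1(A, y, lam=0.01)
```

The relaxation of the irrepresentable condition discussed in the paper shows up empirically as the concave approach recovering the correct support in regimes (fewer measurements, correlated columns) where plain Lasso fails; the dimensions above are deliberately benign so that both methods behave well.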