Flexible and Comprehensive Framework of Element Selection Based on Nonconvex Sparse Optimization

Cited by: 1
Authors
Kawamura, Taiga [1]
Ueno, Natsuki [1]
Ono, Nobutaka [1]
Affiliations
[1] Tokyo Metropolitan Univ, Grad Sch Syst Design, Tokyo 1910065, Japan
Funding
Japan Science and Technology Agency (JST)
Keywords
Optimization; Relaxation methods; Minimization; Signal processing; Dimensionality reduction; Sparse matrices; Indexes; element selection; sparse optimization; proximal operator; Douglas-Rachford splitting method; REGULARIZATION; ALGORITHMS;
DOI
10.1109/ACCESS.2024.3361941
CLC classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
We propose an element selection method for high-dimensional data that is applicable to a wide range of optimization criteria in a unified manner. Element selection is a fundamental technique for reducing the dimensionality of high-dimensional data by simple operations without the use of scalar multiplication. Restorability is one of the commonly used criteria in element selection, and the element selection problem based on restorability is formulated as the minimization of a loss function representing the restoration error between the original data and the restored data. However, conventional methods are applicable only to a limited class of loss functions, such as the ℓ2 norm loss. To enable the use of a wide variety of criteria, we reformulate the element selection problem as a nonconvex sparse optimization problem and derive an optimization algorithm based on the Douglas-Rachford splitting method. The proposed algorithm is applicable to any loss function as long as its proximal operator is available, e.g., the ℓ1 norm loss and the ℓ∞ norm loss as well as the ℓ2 norm loss. We conducted numerical experiments using artificial and real data, and the results indicate that the above loss functions are successfully minimized by the proposed algorithm.
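The abstract builds on the Douglas-Rachford splitting method, which minimizes a sum f(x) + g(x) using only the proximal operators of f and g. Below is a minimal sketch of that iteration on a toy ℓ1-regularized quadratic problem (not the paper's element-selection objective); the function names and the example problem are our own illustration, and the closed-form proximal operators used are standard ones.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (element-wise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_quadratic(v, b, gamma):
    # Proximal operator of gamma * 0.5 * ||x - b||_2^2 (closed form).
    return (v + gamma * b) / (1.0 + gamma)

def douglas_rachford(b, lam, gamma=1.0, n_iter=200):
    # Minimize 0.5*||x - b||_2^2 + lam*||x||_1 via Douglas-Rachford splitting.
    z = np.zeros_like(b)
    for _ in range(n_iter):
        x = prox_quadratic(z, b, gamma)                # prox of the data-fit term
        y = soft_threshold(2 * x - z, gamma * lam)     # prox of the l1 term (reflected step)
        z = z + y - x                                  # Douglas-Rachford update of the auxiliary variable
    return prox_quadratic(z, b, gamma)

b = np.array([3.0, -0.5, 1.5, 0.2])
x = douglas_rachford(b, lam=1.0)
# For this separable problem, the minimizer is soft_threshold(b, lam).
```

Swapping `soft_threshold` for another proximal operator (e.g., of the ℓ∞ norm) changes the criterion without changing the iteration, which is the flexibility the paper exploits.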
Pages: 21337-21346 (10 pages)