Learning broad learning system with controllable sparsity through L0 regularization

Cited by: 7
Authors
Chu, Fei [1 ,2 ,3 ]
Wang, Guanghui [2 ]
Wang, Jun [2 ]
Chen, C. L. Philip [4 ]
Wang, Xuesong [2 ]
Affiliations
[1] China Univ Min & Technol, Artificial Intelligence Res Inst, Xuzhou 221116, Peoples R China
[2] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221116, Peoples R China
[3] Beijing Gen Res Inst Min & Met, State Key Lab Automat Control Technol Min & Met P, Beijing 100160, Peoples R China
[4] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510641, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Broad Learning System (BLS); Network compression; Sparse representation; Controllable; Normalized iterative hard thresholding (NIHT); NEURAL-NETWORKS; APPROXIMATION;
DOI
10.1016/j.asoc.2023.110068
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
As a novel neural network with efficient learning capacity, the broad learning system (BLS) has achieved remarkable success on various regression and classification problems. Because of its broad expansion of nodes, however, BLS is known to carry many redundant parameters and nodes, which increase memory and computation costs and hinder deployment on resource-limited equipment. To optimize the number of neurons and parameters in BLS, and thereby find the optimal sparse model under a given resource budget, this paper proposes training BLS with L0 regularization. The regularization term of the BLS objective function is replaced with an L0 constraint, and normalized iterative hard thresholding (NIHT) is used to optimize the output weights. More concretely, the model size is fixed by limiting the number of nonzero output weights to the given resource budget; the parameters and nodes of the network are then evaluated and selected from the node set during training, yielding a BLS with controllable sparsity (CSBLS). Experiments on various data sets demonstrate the effectiveness of the proposed method. (C) 2023 Published by Elsevier B.V.
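As a rough illustration of the optimization step the abstract describes, the following is a minimal NumPy sketch of normalized iterative hard thresholding for an L0-constrained least-squares problem. This is not the authors' code: the matrix `A` stands in for the concatenated feature/enhancement-node output matrix of BLS, `w` for the output weights, and the sparsity level `k` for the resource budget; all names and defaults here are illustrative assumptions.

```python
import numpy as np

def niht(A, y, k, n_iter=200, tol=1e-10):
    """Approximately solve  min ||y - A w||^2  s.t.  ||w||_0 <= k
    via normalized iterative hard thresholding (NIHT)."""
    m, n = A.shape
    w = np.zeros(n)
    for _ in range(n_iter):
        g = A.T @ (y - A @ w)                   # negative gradient of 0.5*||y - A w||^2
        support = np.flatnonzero(w)
        if support.size == 0:                   # first pass: seed support with the
            support = np.argsort(np.abs(g))[-k:]  # k largest-magnitude gradient entries
        gs = g[support]
        denom = np.linalg.norm(A[:, support] @ gs) ** 2
        mu = (gs @ gs) / denom if denom > 0 else 1.0  # normalized step size on the support
        w_new = w + mu * g
        w_new[np.argsort(np.abs(w_new))[:-k]] = 0.0   # hard threshold: keep k largest magnitudes
        if np.linalg.norm(w_new - w) <= tol * max(np.linalg.norm(w), 1.0):
            w = w_new
            break
        w = w_new
    return w
```

Because the hard-thresholding step keeps exactly the `k` largest-magnitude weights at every iteration, the sparsity of the returned model is controllable by construction, which mirrors the "controllable sparsity" property claimed for CSBLS.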
Pages: 9
Related Papers
50 records total
  • [31] DESTRIPING ALGORITHM WITH L0 SPARSITY PRIOR FOR REMOTE SENSING IMAGES
    Liu, Hai
    Zhang, Zhaoli
    Liu, Sanya
    Liu, Tingting
    Chang, Yi
    2015 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2015, : 2295 - 2299
  • [32] Blind Deconvolution With Nonlocal Similarity and l0 Sparsity for Noisy Image
    Ren, Weihong
    Tian, Jiandong
    Tang, Yandong
    IEEE SIGNAL PROCESSING LETTERS, 2016, 23 (04) : 439 - 443
  • [33] Towards Compact Broad Learning System by Combined Sparse Regularization
    Miao, Jianyu
    Yang, Tiejun
    Jin, Jun-Wei
    Sun, Lijun
    Niu, Lingfeng
    Shi, Yong
    INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGY & DECISION MAKING, 2022, 21 (01) : 169 - 194
  • [34] Semi-Supervised Broad Learning System Based on Manifold Regularization and Broad Network
    Zhao, Huimin
    Zheng, Jianjie
    Deng, Wu
    Song, Yingjie
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 2020, 67 (03) : 983 - 994
  • [35] l0 Sparsifying Transform Learning With Efficient Optimal Updates and Convergence Guarantees
    Ravishankar, Saiprasad
    Bresler, Yoram
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2015, 63 (09) : 2389 - 2404
  • [36] Compound L0 regularization method for image blind motion deblurring
    Liu, Qiaohong
    Sun, Liping
    Shao, Zeguo
    JOURNAL OF ELECTRONIC IMAGING, 2016, 25 (05)
  • [37] A New Smoothed L0 Regularization Approach for Sparse Signal Recovery
    Xiang, Jianhong
    Yue, Huihui
    Yin, Xiangjun
    Wang, Linyu
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2019, 2019
  • [38] l0 norm based dictionary learning by proximal methods with global convergence
    Bao, Chenglong
    Ji, Hui
    Quan, Yuhui
    Shen, Zuowei
    2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2014, : 3858 - 3865
  • [39] On optimal solutions of the constrained l0 regularization and its penalty problem
    Zhang, Na
    Li, Qia
    INVERSE PROBLEMS, 2017, 33 (02)
  • [40] Dynamic Narrowing of VAE Bottlenecks Using GECO and L0 Regularization
    De Boom, Cedric
    Wauthier, Samuel
    Verbelen, Tim
    Dhoedt, Bart
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,