A Sparse Modeling Method Based on Reduction of Cost Function in Regularized Forward Selection

Cited by: 0
Author
Hagiwara, Katsuyuki [1 ]
Affiliation
[1] Mie Univ, Fac Educ, Tsu, Mie 5148507, Japan
Source
IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS | 2014, Vol. E97-D, No. 1
Keywords
regularized forward selection; nonparametric regression; sparse representation; thresholding method; cross validation; REGRESSION; PREDICTION; SHRINKAGE; MACHINE;
DOI
10.1587/transinf.E97.D.98
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Regularized forward selection can be viewed as a method for obtaining a sparse representation in a nonparametric regression problem. In regularized forward selection, the regression output is represented by a weighted sum of several significant basis functions, selected from a large number of candidates by a greedy training procedure on a regularized cost function together with an appropriate model selection method. In this paper, we propose a model selection method for regularized forward selection. To this end, we focus on the reduction of the cost function brought about by appending a new basis function in the greedy training procedure. We first clarify a bias-variance decomposition of this cost reduction and then derive a probabilistic upper bound on the variance of the cost reduction under some conditions. The derived upper bound reflects an essential feature of the greedy training procedure, namely that it selects the basis function that maximally reduces the cost function. We then propose a thresholding method for determining significant basis functions, in which the derived upper bound serves as the threshold level and is effectively combined with leave-one-out cross validation. Several numerical experiments show that the generalization performance of the proposed method is comparable to that of other methods, while the number of basis functions it selects is much smaller. The proposed method therefore yields a sparse representation while maintaining relatively good generalization performance. Moreover, it has the advantage of being free from the selection of a regularization parameter.
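The greedy procedure described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions: it uses a fixed ridge penalty `lam` and a simple stopping tolerance `tol` rather than the paper's variance-based threshold or its combination with leave-one-out cross validation, and all names here are hypothetical.

```python
import numpy as np

def regularized_forward_selection(Phi, y, lam=0.1, max_terms=10, tol=1e-6):
    """Greedy forward selection under a ridge-regularized squared-error cost.

    Phi : (n, K) matrix of candidate basis-function outputs.
    y   : (n,) target vector.
    lam : regularization parameter (fixed here for illustration; the paper's
          method is free from tuning this, which is not reproduced).
    At each step, the candidate basis function giving the largest reduction
    of the regularized cost is appended; selection stops when the reduction
    falls below `tol`.
    """
    n, K = Phi.shape
    selected = []
    cost = float(y @ y)  # cost of the empty model
    for _ in range(max_terms):
        best_j, best_cost = None, cost
        for j in range(K):
            if j in selected:
                continue
            cols = selected + [j]
            A = Phi[:, cols]
            # ridge solution for the current candidate set
            w = np.linalg.solve(A.T @ A + lam * np.eye(len(cols)), A.T @ y)
            r = y - A @ w
            c = float(r @ r + lam * w @ w)
            if c < best_cost:
                best_j, best_cost = j, c
        if best_j is None or cost - best_cost < tol:
            break  # no candidate reduces the cost enough
        selected.append(best_j)
        cost = best_cost
    return selected, cost
```

On data generated from two of the candidate columns, the greedy loop picks those columns first and then stops once the remaining candidates no longer reduce the regularized cost appreciably.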
Pages: 98-106
Page count: 9
Related Papers
50 records in total
  • [1] Representative selection based on sparse modeling
    Wang, Yu
    Tang, Sheng
    Zhang, Yong-Dong
    Li, Jin-Tao
    Wang, Dong
    NEUROCOMPUTING, 2014, 139: 423-431
  • [2] A new learning method for single layer neural networks based on a regularized cost function
    Suárez-Romero, JA
    Fontenla-Romero, O
    Guijarro-Berdiñas, B
    Alonso-Betanzos, A
    COMPUTATIONAL METHODS IN NEURAL MODELING, PT 1, 2003, 2686: 270-277
  • [3] Adaptive regularized method based on homotopy for sparse fluorescence tomography
    Xue, Zhenwen
    Ma, Xibo
    Zhang, Qian
    Wu, Ping
    Yang, Xin
    Tian, Jie
    APPLIED OPTICS, 2013, 52 (11): 2374-2384
  • [4] A Gravity Forward Modeling Method based on Multiquadric Radial Basis Function
    Liu Yan
    Lv Qingtian
    Huang Yao
    She Danian
    Meng Guixiang
    Yan Jiayong
    Zhang Yongqian
    ACTA GEOLOGICA SINICA-ENGLISH EDITION, 2021, 95: 62-64
  • [6] Sparse Support Vector Regressors Based on Forward Basis Selection
    Muraoka, Shigenori
    Abe, Shigeo
    IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2009: 1125-1129
  • [7] An inexact interior point method for L1-regularized sparse covariance selection
    Li L.
    Toh K.-C.
    Mathematical Programming Computation, 2010, 2 (3-4): 291-315
  • [8] An Adaptive Regularized Subspace Pursuit based Variable Step-size Method for Power Amplifier Sparse Model Selection
    Wang, Fen
    Yu, Cuiping
    Li, Shulan
    Su, Ming
    Liu, Yuanan
    2021 IEEE MTT-S INTERNATIONAL WIRELESS SYMPOSIUM (IWS 2021), 2021
  • [9] Forward-Backward Nonlinear Sparse Dictionary Selection based Video Summarization
    Ma, Mingyang
    Mei, Shaohui
    Wan, Shuai
    Wang, Zhiyong
    Feng, David Dagan
    2018 IEEE FOURTH INTERNATIONAL CONFERENCE ON MULTIMEDIA BIG DATA (BIGMM), 2018
  • [10] An efficient adaptive forward-backward selection method for sparse polynomial chaos expansion
    Zhao, Huan
    Gao, Zhenghong
    Xu, Fang
    Zhang, Yidian
    Huang, Jiangtao
    COMPUTER METHODS IN APPLIED MECHANICS AND ENGINEERING, 2019, 355: 456-491