Non-negative and local sparse coding based on l2-norm and Hessian regularization

Cited by: 6
Authors
Zhang, Jinghui [1]
Wan, Yuan [1]
Chen, Zhiping [1]
Meng, Xiaojing [1]
Affiliations
[1] Wuhan Univ Technol, Sch Sci, Wuhan 430070, Hubei, Peoples R China
Keywords
Hessian regularization; Elastic net model; Sparse coding; Image classification
DOI
10.1016/j.ins.2019.02.024
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
Due to their efficiency in representing visual data and reducing the dimensionality of complex structures, sparse coding methods have been widely investigated and have achieved excellent performance in image classification. These methods learn both a dictionary and the sparse codes from the original data under an l1-norm constraint. However, the l1-norm tends to select only a small number of atoms from the relevant bases during dictionary learning and discards other highly correlated bases, which neglects the group effect and weakens the generalization of the model. In this paper, we propose a novel sparse coding model that introduces an l2-norm constraint and a second-order Hessian energy term into the objective function. This model removes the restriction on the number of selected basis vectors in dictionary learning and makes better use of topological structure information, so the intrinsic geometric characteristics of the data are described more accurately. In addition, the model is extended with a non-negative local constraint, which ensures that similar features share their local bases. Extensive experimental results on real-world datasets show that the proposed model clearly outperforms several state-of-the-art image representation methods. (C) 2019 Elsevier Inc. All rights reserved.
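The abstract describes the objective only in words. As a hedged sketch (the exact formulation is given in the paper at the DOI above; the symbols, weights, and the construction of the Hessian energy matrix below are assumptions for illustration), an elastic-net sparse coding objective with second-order Hessian regularization and a non-negativity constraint typically takes the form

\[
\min_{B,\; S \ge 0} \; \lVert X - B S \rVert_F^2 \;+\; \lambda \lVert S \rVert_1 \;+\; \gamma \lVert S \rVert_F^2 \;+\; \beta \,\operatorname{tr}\!\bigl(S \mathcal{H} S^{\top}\bigr)
\quad \text{s.t. } \lVert b_i \rVert_2 \le 1,
\]

where \(X\) is the data matrix, \(B\) is the dictionary with atoms \(b_i\), \(S\) is the matrix of sparse codes, \(\mathcal{H}\) is the Hessian energy matrix estimated from the data manifold, and \(\lambda, \gamma, \beta \ge 0\) are trade-off parameters. The joint l1/l2 penalty is the elastic-net term that allows groups of correlated atoms to be selected together, and \(S \ge 0\) is the non-negative (local) constraint.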
Pages: 88-100
Page count: 13
Related papers
50 records in total
  • [1] Yuan, Guang-Lin; Xue, Mo-Gen. Robust coding via L2-norm regularization for visual tracking. Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2014, 36(08): 1838-1843.
  • [2] Hosseini, Babak; Hammer, Barbara. Non-negative Local Sparse Coding for Subspace Clustering. Advances in Intelligent Data Analysis XVII (IDA 2018), 2018, 11191: 137-150.
  • [3] Liu, Chanzi; Chen, Qingchun; Zhou, Bingpeng; Li, Hengchao. l1- and l2-Norm Joint Regularization Based Sparse Signal Reconstruction Scheme. Mathematical Problems in Engineering, 2016, 2016.
  • [4] Hoyer, PO. Non-negative sparse coding. Neural Networks for Signal Processing XII, Proceedings, 2002: 557-565.
  • [5] Yang, Zheng; Liu, Haifeng; Cai, Deng; Wu, Zhaohui. A-Optimal Non-negative Projection with Hessian regularization. Neurocomputing, 2016, 174: 838-849.
  • [6] Mohammadi, M. R.; Fatemizadeh, E.; Mahoor, M. H. Non-negative sparse decomposition based on constrained smoothed l0 norm. Signal Processing, 2014, 100: 42-50.
  • [7] Wang, Wenwu. Convolutive Non-Negative Sparse Coding. 2008 IEEE International Joint Conference on Neural Networks, Vols 1-8, 2008: 3681-3684.
  • [9] Wang, Dan; Xie, Guo; Ji, WenJiang; Liang, LiLi; Gao, Huan; Chen, Pang; Liu, Han. Weighted Sparse Graph Non-negative Matrix Factorization based on L21 norm. Proceedings of the 32nd 2020 Chinese Control and Decision Conference (CCDC 2020), 2020: 4114-4118.
  • [10] Wang, Xin; Wang, Can; Shang, Li; Sun, Zhan-Li. Dispersion Constraint Based Non-negative Sparse Coding Model. Neural Processing Letters, 2016, 43(02): 603-609.