Tsallis entropy based uncertainty relations on sparse representation for vector and matrix signals

Cited by: 1
Authors
Xu, Guanlei [1 ]
Xu, Xiaogang [1 ]
Wang, Xiaotong [1 ]
Affiliations
[1] Zhejiang Gongshang Univ, Coll Comp & Informat Engn, Hangzhou 310018, Peoples R China
Keywords
Tsallis entropy; Sparse representation; Matrix eigendecomposition; Matrix eigenvalue and eigenvector; Correlation between bases
DOI
10.1016/j.ins.2022.10.100
Chinese Library Classification
TP (Automation and Computer Technology)
Discipline Code
0812
Abstract
In this paper, novel Tsallis entropy based uncertainty relations for vector and matrix signals under sparse representation are derived for the first time. The new uncertainty bounds depend not only on the entropy parameter, but also on the vector length, the minimum non-zero correlation between a standard orthogonal basis and the given signals, the maximum correlation between two given orthogonal bases, and the matrix eigenvalues together with their eigenvectors. In particular, the relationship between the uncertainty bounds and the matrix eigenvalues is discussed, which yields a new interpretation of sparse representation. Theoretical analysis and numerical examples verify the newly proposed uncertainty principles; for example, for special values of the Tsallis entropy parameter, the uncertainty bound reaches its peak for the sparsest representation of a matrix (i.e., only one eigenvalue is non-zero). Moreover, various numerical relations between the uncertainty bounds and the Tsallis entropy parameters are illustrated graphically, which may offer guidance for future sparse representation analysis. (c) 2022 Elsevier Inc. All rights reserved.
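The link the abstract draws between sparsity and the entropy bound can be illustrated numerically: for any parameter q > 0, the Tsallis entropy of a matrix's normalized eigenvalue distribution is smallest when only one eigenvalue is non-zero (the sparsest case) and largest when the eigenvalues are uniform. The sketch below is a minimal illustration of this monotonicity, not the paper's actual derivation; the function name and the example matrix are my own assumptions.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    Recovers Shannon entropy in the limit q -> 1.
    Illustrative helper, not from the paper itself.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # ignore zero-probability entries
    if abs(q - 1.0) < 1e-12:
        return float(-np.sum(p * np.log(p)))  # Shannon limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

# Arbitrary symmetric example matrix; its eigenvalue magnitudes,
# normalized to sum to one, play the role of a probability distribution.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
lam = np.abs(np.linalg.eigvalsh(A))
p = lam / lam.sum()

sparse_p = np.array([1.0, 0.0, 0.0])  # sparsest case: one non-zero eigenvalue
uniform_p = np.ones(3) / 3.0          # least sparse case: uniform eigenvalues

# For q > 0 the Tsallis entropy is Schur-concave: minimal at the
# one-hot (sparsest) distribution, maximal at the uniform one.
for q in (0.5, 2.0):
    assert tsallis_entropy(sparse_p, q) <= tsallis_entropy(p, q) <= tsallis_entropy(uniform_p, q)
```

This ordering is what makes entropy a usable sparsity measure in the uncertainty relations: the extreme (sparsest) eigenvalue configuration pins the entropy, and hence the bound, at an extremal value.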
Pages: 359-372 (14 pages)