A note on the blind deconvolution of multiple sparse signals from unknown subspaces

Cited by: 7
Authors
Cosse, Augustin [1, 2, 3]
Affiliations
[1] NYU, Courant Inst Math Sci, New York, NY 10003 USA
[2] NYU, Ctr Data Sci, NYC, New York, NY 10003 USA
[3] Ecole Normale Super, Dept Math & Applicat, Paris, France
Source
WAVELETS AND SPARSITY XVII | 2017 / Vol. 10394
Keywords
Blind deconvolution; ℓ1-minimization; Compressed sensing; Convex programming; Schur complement; ESPIRIT
DOI
10.1117/12.2272836
Chinese Library Classification (CLC)
O43 [Optics];
Subject classification codes
070207 ; 0803 ;
Abstract
This note studies the recovery of multiple sparse signals $x_n \in \mathbb{R}^L$, $n = 1, \ldots, N$, from the knowledge of their convolution with an unknown point spread function $h \in \mathbb{R}^L$. When the point spread function is known to be nonvanishing, $|h[k]| > 0$, this blind deconvolution problem can be relaxed into a linear, ill-posed inverse problem in the vector concatenating the unknown inputs $x_n$ together with the inverse of the filter, $d \in \mathbb{R}^L$, where $d[k] = 1/h[k]$. When prior information is given on the input subspaces, the resulting overdetermined linear system can be solved efficiently via least squares (see Ling et al. 2016 [1]). When no information is given on those subspaces and the inputs are only known to be sparse, it remains possible to recover these inputs along with the filter by adding an $\ell_1$ penalty. This note certifies exact recovery of both the unknown PSF and the unknown sparse inputs, from the knowledge of their convolutions, as soon as the number of inputs $N$ and the dimension of each input $L$ satisfy $L \gtrsim N$ and $N \gtrsim T_{\max}^2$, up to log factors. Here $T_{\max} = \max_n |T_n|$, and $T_n$, $n = 1, \ldots, N$, denote the supports of the inputs $x_n$. Our proof combines recent results on blind deconvolution via least squares, used to certify the invertibility of the linear map encoding the convolutions, with the construction of a dual certificate following the structure first suggested in Candès et al. 2007 [2]. Unlike in these papers, however, it is not possible to rely on the norm $\|(A_T^* A_T)^{-1}\|$ to certify recovery. We instead use a combination of the Schur complement and a Neumann series to derive an expression for the inverse $(A_T^* A_T)^{-1}$. Given this expression, it is possible to show that the poorly scaled blocks in $(A_T^* A_T)^{-1}$ are either multiplied by the better scaled ones or vanish in the construction of the certificate. Recovery is certified with high probability over the choice of the supports and the distribution of the signs of each input $x_n$ on its support. The paper follows the line of previous work by Wang et al. 2016 [3], where the authors guarantee recovery for subgaussian-Bernoulli inputs satisfying $\mathbb{E}\, x_n[k] \in [1/10, 1]$ as soon as $N \gtrsim L$. Examples of applications include seismic imaging with an unknown source, marine seismic data deghosting, magnetic resonance autocalibration, and multichannel estimation in communications. Numerical experiments are provided, along with a discussion of the tightness of the sample complexity.
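To make the linearization described above concrete, the following is a minimal numerical sketch, not the paper's exact program: it assumes circular convolution, replaces the elementwise filter inversion by an inversion of the filter spectrum in the Fourier domain, fixes the inherent scaling ambiguity with an ad hoc constraint, and solves the resulting ℓ1 problem with cvxpy. All variable names and the normalization are illustrative assumptions. The bilinear relation $y_n = h \circledast x_n$ is replaced by the constraint $\hat{y}_n \odot d = F x_n$, which is jointly linear in $(x_n, d)$.

```python
# Hedged sketch of the linearized blind deconvolution with an l1 penalty.
# Assumptions (not from the paper): Fourier-domain inverse filter, circular
# convolution, scale fixed using the true filter purely for this demo.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
L, N, T = 64, 16, 2                         # signal length, number of inputs, sparsity

# Ground truth: a filter with a nonvanishing spectrum and N sparse inputs.
h = rng.standard_normal(L) + 3.0            # offset keeps the spectrum away from zero (assumption)
X = np.zeros((L, N))
for n in range(N):
    X[rng.choice(L, T, replace=False), n] = rng.standard_normal(T)

H = np.fft.fft(h)
Y = H[:, None] * np.fft.fft(X, axis=0)      # DFT of the observations y_n = h (circ)* x_n
F = np.fft.fft(np.eye(L))                   # DFT matrix, so F @ x == np.fft.fft(x)

# Unknowns: the sparse inputs (time domain) and the inverse-filter spectrum.
Xv = cp.Variable((L, N))
d = cp.Variable(L, complex=True)

# Linear constraints in (x_n, d): Y_n elementwise-times d equals the DFT of x_n.
constraints = [cp.multiply(Y[:, n], d) == F @ Xv[:, n] for n in range(N)]
# The system is invariant to a common rescaling of (d, x_n) and admits the zero
# solution; pin the scale with one linear equation (demo only, uses the true h).
constraints.append(cp.sum(d) == np.sum(1.0 / H))

prob = cp.Problem(cp.Minimize(cp.sum(cp.abs(Xv))), constraints)
prob.solve()
print("relative error on the inputs:",
      np.linalg.norm(Xv.value - X) / np.linalg.norm(X))
```

When the sparsity and sample-size conditions of the abstract hold, the printed relative error should be small; the sketch is only meant to illustrate how the bilinear measurements become a linear system once the filter is inverted elementwise.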
Pages: 18
Related papers
50 records in total
  • [21] On the Global Geometry of Sphere-Constrained Sparse Blind Deconvolution
    Zhang, Yuqian
    Lau, Yenson
    Kuo, Han-Wen
    Cheung, Sky
    Pasupathy, Abhay
    Wright, John
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43 (03) : 999 - 1008
  • [22] MULTIPLE FRAME PROJECTION BASED BLIND DECONVOLUTION
    LAW, NF
    NGUYEN, DT
    ELECTRONICS LETTERS, 1995, 31 (20) : 1734 - 1735
  • [23] Approach based on colored character to blind deconvolution for speech signals
    Cong, FY
    Jia, P
    Ji, SL
    Shi, XZ
    Wang, ZH
    Chen, CH
    PROCEEDINGS OF THE 2004 INTERNATIONAL CONFERENCE ON INTELLIGENT MECHATRONICS AND AUTOMATION, 2004, : 397 - 399
  • [24] Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent
    Qu, Qing
    Li, Xiao
    Zhu, Zhihui
    SIAM JOURNAL ON IMAGING SCIENCES, 2020, 13 (03): : 1630 - 1652
  • [25] Sparse blind deconvolution and demixing through ℓ1,2-minimization
    Flinth, Axel
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2018, 44 (01) : 1 - 21
  • [26] Rank-Awareness Sparse Blind Deconvolution Using Modulated Input
    Zhang, Jingchao
    Cao, Qian
    Su, Yinuo
    Qiao, Liyan
    CIRCUITS SYSTEMS AND SIGNAL PROCESSING, 2023, 42 (11) : 6684 - 6700
  • [27] Sparse blind deconvolution and demixing through ℓ1,2-minimization
    Axel Flinth
    Advances in Computational Mathematics, 2018, 44 : 1 - 21
  • [28] Convolutional Sparse Learning for Blind Deconvolution and Application on Impulsive Feature Detection
    Du, Zhaohui
    Chen, Xuefeng
    Zhang, Han
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2018, 67 (02) : 338 - 349
  • [29] Maximum Spectral Sparse Entropy Blind Deconvolution for Bearing Fault Diagnosis
    Cai, Binghuan
    Tang, Gang
    IEEE SENSORS JOURNAL, 2024, 24 (05) : 6451 - 6468
  • [30] Blind Speech Deconvolution via Pretrained Polynomial Dictionary and Sparse Representation
    Guan, Jian
    Wang, Xuan
    Qi, Shuhan
    Dong, Jing
    Wang, Wenwu
    ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2017, PT I, 2018, 10735 : 411 - 420