Bilinear Compressed Sensing Under Known Signs via Convex Programming

Cited by: 3
Authors
Aghasi, Alireza [1 ]
Ahmed, Ali [2 ]
Hand, Paul [3 ,4 ]
Joshi, Babhru [5 ]
Affiliations
[1] Georgia State Univ, J Mack Robinson Coll Business, Atlanta, GA 30302 USA
[2] ITU, Dept Elect Engn, Lahore 54000, Pakistan
[3] Northeastern Univ, Dept Math, Boston, MA 02115 USA
[4] Northeastern Univ, Khoury Coll Comp Sci, Boston, MA 02115 USA
[5] Univ British Columbia, Dept Math, Vancouver, BC V6T 1Z4, Canada
Funding
US National Science Foundation;
Keywords
Inverse problems; deconvolution; blind source separation; compressed sensing; optimization; PHASE RETRIEVAL; BLIND DECONVOLUTION; SPARSE; ALGORITHMS; RECOVERY;
DOI
10.1109/TSP.2020.3017929
Chinese Library Classification
TM [Electrical technology]; TN [Electronic technology, communication technology];
Discipline codes
0808 ; 0809 ;
Abstract
We consider the bilinear inverse problem of recovering two vectors, x ∈ ℝ^L and w ∈ ℝ^L, from their entrywise product. We consider the case where x and w have known signs and are sparse with respect to known dictionaries of size K and N, respectively. Here, K and N may be larger than, smaller than, or equal to L. We introduce ℓ1-BranchHull, a convex program posed in the natural parameter space that requires neither an approximate solution nor an initialization in order to be stated or solved. Under the assumptions that x and w satisfy a comparable-effective-sparsity condition and are S1- and S2-sparse with respect to a random dictionary, we present a recovery guarantee in the noisy case. We show that ℓ1-BranchHull is robust to small dense noise with high probability if the number of measurements satisfies L ≥ Ω((S1 + S2) log²(K + N)). Numerical experiments show that the scaling constant in the theorem is not too large. We also introduce variants of ℓ1-BranchHull for tolerating noise and outliers, and for recovering piecewise constant signals. We provide an ADMM implementation of these variants and show that they can extract piecewise constant behavior from real images.
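The measurement model in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's implementation: the dictionary names B and C, the dimensions, and the sparsity levels are all chosen here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Problem dimensions (illustrative values, not from the paper's experiments).
L, K, N = 64, 32, 32   # measurement length and dictionary sizes
S1, S2 = 3, 3          # sparsity of the two coefficient vectors

# Random dictionaries, matching the random-dictionary assumption
# under which the recovery guarantee is stated.
B = rng.standard_normal((L, K))
C = rng.standard_normal((L, N))

# Sparse coefficient vectors h and m, so that w = B h and x = C m.
h = np.zeros(K)
h[rng.choice(K, S1, replace=False)] = rng.standard_normal(S1)
m = np.zeros(N)
m[rng.choice(N, S2, replace=False)] = rng.standard_normal(S2)

w = B @ h
x = C @ m

# Bilinear measurements: the entrywise product of the two signals.
y = w * x

# The "known signs" assumption: the sign pattern of each signal is given
# to the recovery program alongside y, B, and C.
signs_w, signs_x = np.sign(w), np.sign(x)
```

Recovering (h, m) from (y, signs_w, signs_x, B, C) is the task ℓ1-BranchHull addresses; the convex program itself is not reproduced here, since its constraints are not spelled out in this record.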
Pages: 6366 - 6379
Page count: 14
Related papers
50 total
  • [31] Perfect Secrecy via Compressed Sensing
    Mayiami, Mahmoud Ramezani
    Seyfe, Babak
    Bafghi, Hamid G.
    2013 IRAN WORKSHOP ON COMMUNICATION AND INFORMATION THEORY (IWCIT), 2013,
  • [32] Rational sphere maps, linear programming, and compressed sensing
    D'Angelo, J. P.
    Grundmeier, D.
    Lebl, J.
    Complex Analysis and its Synergies, 2020, 6 (1)
  • [33] One-Bit Compressed Sensing by Linear Programming
    Plan, Yaniv
    Vershynin, Roman
    COMMUNICATIONS ON PURE AND APPLIED MATHEMATICS, 2013, 66 (08) : 1275 - 1297
  • [34] Network Tomography via Compressed Sensing
    Firooz, Mohammad H.
    Roy, Sumit
    2010 IEEE GLOBAL TELECOMMUNICATIONS CONFERENCE GLOBECOM 2010, 2010,
  • [35] On the image encryption via compressed sensing
    Zhang, Gesen
    Jiao, Shuhong
    Xu, Xiaoli
    ICIC EXPRESS LETTERS, 2011, 5 (01): 261 - 266
  • [36] Convex Combination of Compressed Sensing Algorithms for Multiple Measurement Vectors
    Bapat, Ketan Atul
    Shashank, S.
    Chakraborty, Mrityunjoy
    2024 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS, SPCOM 2024, 2024,
  • [37] On the perturbation of measurement matrix in non-convex compressed sensing
    Ince, Taner
    Nacaroglu, Arif
    SIGNAL PROCESSING, 2014, 98 : 143 - 149
  • [38] Compressed Sensing under Optimal Quantization
    Kipnis, Alon
    Reeves, Galen
    Eldar, Yonina C.
    Goldsmith, Andrea J.
    2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2017, : 2148 - 2152
  • [39] Superreplication of Financial Derivatives via Convex Programming
    Kahale, Nabil
    MANAGEMENT SCIENCE, 2017, 63 (07) : 2323 - 2339
  • [40] MINIMIZING CONDITION NUMBER VIA CONVEX PROGRAMMING
    Lu, Zhaosong
    Pong, Ting Kei
    SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 2011, 32 (04) : 1193 - 1211