ISAR autofocus based on sparsity-driven estimation of translational and rotational motion components

Cited: 2
Authors
Hamad, Ahmad [1 ]
Ender, Joachim [1 ]
Affiliations
[1] Univ Siegen, Ctr Sensor Syst ZESS, Siegen, Germany
Keywords
ISAR autofocus; Compressed Sensing (CS); Motion estimation; TARGETS;
DOI
10.1117/12.2532398
CLC Number
TP7 [Remote Sensing Technology];
Subject Classification Codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
Inverse Synthetic Aperture Radar (ISAR) relies on the motion of the observed target to synthetically generate an image. While the motion of the target might be known in controlled turn-table experiments, this is not true in the general case of non-cooperative objects. To obtain a focused, high-resolution image of the object, the relative motion between the radar and the object must be estimated. The process of estimating the relative motion components from the received signals alone and then compensating for them to produce the ISAR image is called ISAR autofocus. Compressed Sensing (CS) tackles the problem of recovering an unknown signal from fewer measurements than required by the Shannon-Nyquist sampling theorem. CS assumes that the signal to be recovered is sparse, either directly or in some representation basis. In general, the object's reflectivity distribution is not sparse. However, in some cases the ISAR measurements can be approximated by the superposition of the echoes of a group of scattering centers. This interpretation of ISAR images allows CS algorithms to be used for the reconstruction of ISAR images. In this paper, we propose a CS-based algorithm for estimating the relative motion between the radar and the object as a first step toward focusing ISAR images. We verify, with simulated data, the ability of the proposed algorithm to estimate both the relative translational and rotational motion of the observed object with respect to the radar. Future work will test the performance of the algorithm with real data as well as extend the algorithm to handle three-dimensional motion components.
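As a rough illustration of the sparse-recovery idea the abstract refers to (not the authors' algorithm), the Python sketch below recovers a synthetic sparse scattering-center vector from underdetermined linear measurements by l1-regularized least squares solved with ISTA; the random sensing matrix, dimensions, sparsity level, and regularization weight are illustrative assumptions rather than anything taken from the paper.

# Minimal sketch of compressed sensing recovery: y = A x + noise with m < n measurements,
# where x is a sparse stand-in for a scattering-center reflectivity vector.
# Solved with ISTA for min_x 0.5*||Ax - y||^2 + lam*||x||_1. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 64, 5                                   # signal length, measurements (m < n), sparsity (assumed)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)           # random sensing matrix (stand-in for an ISAR dictionary)
y = A @ x_true + 0.01 * rng.standard_normal(m)

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative shrinkage-thresholding for l1-regularized least squares."""
    L = np.linalg.norm(A, 2) ** 2                      # Lipschitz constant of the data-fidelity gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                       # gradient step on 0.5*||Ax - y||^2
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft thresholding (l1 proximal step)
    return x

x_hat = ista(A, y)
print("recovered support:", np.nonzero(np.abs(x_hat) > 1e-2)[0])
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

In the paper's setting the unknowns additionally include the translational and rotational motion parameters, so the sensing matrix itself depends on the motion hypothesis; the sketch only shows the sparse-reconstruction building block.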
Pages: 7
Related Papers
50 records in total
  • [1] Three Dimensional ISAR Autofocus based on Sparsity Driven Motion Estimation
    Hamad, Ahmad
    Ender, Joachim
    2020 21ST INTERNATIONAL RADAR SYMPOSIUM (IRS 2020), 2020, : 51 - 56
  • [2] Sparsity-driven Autofocus for Multipass SAR Tomography
    Muirhead, F.
    Mulgrew, B.
    Woodhouse, I. H.
    Greig, D.
    SAR IMAGE ANALYSIS, MODELING, AND TECHNIQUES XV, 2015, 9642
  • [3] Sparsity-Driven Stripmap SAR Imaging and Phase Error Estimation Based on Phase Curvature Autofocus
    Yu, Deshui
    Zhu, Ziyi
    Zhang, Jingjing
    Song, Yufan
    Bi, Hui
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2024, 21
  • [4] Sparsity-Driven ISAR Imaging Based on Two-Dimensional ADMM
    Hashempour, Hamid Reza
    IEEE SENSORS JOURNAL, 2020, 20 (22) : 13349 - 13356
  • [5] Structured sparsity-driven autofocus algorithm for high-resolution radar imagery
    Zhao, Lifan
    Wang, Lu
    Bi, Guoan
    Li, Shenghong
    Yang, Lei
    Zhang, Haijian
    SIGNAL PROCESSING, 2016, 125 : 376 - 388
  • [6] Sparsity-Driven ISAR Imaging via Hierarchical Channel-Mixed Framework
    Liang, Jiadian
    Wei, Shunjun
    Wang, Mou
    Shi, Jun
    Zhang, Xiaoling
    IEEE SENSORS JOURNAL, 2021, 21 (17) : 19222 - 19235
  • [7] Clustered Sparsity-Driven SAR Imaging and Autofocus Algorithm in Structured Phase-Noisy Environments
    Yang, Yue
    Zhang, Xuejing
    Gui, Guan
    Wan, Qun
    IEEE ACCESS, 2019, 7 : 70200 - 70211
  • [8] Translational motion estimation with multistatic ISAR systems
    Testa, Alejandro
    Santi, Fabrizio
    Pastina, Debora
    2021 21ST INTERNATIONAL RADAR SYMPOSIUM (IRS), 2021
  • [9] Sparsity-driven bandwidth factorisation autofocus of high-resolution squint SAR imagery reconstructed by FFBP
    Wang, Xin
    Sun, Xiaoxiao
    JOURNAL OF ENGINEERING-JOE, 2019, (19): 5994 - 5998
  • [10] An autofocus algorithm for ISAR based on the maximum likelihood estimation
    ATR Key Lab, National Univ. of Defense Technology, Changsha 410073, China
    GUOFANG KEJI DAXUE XUEBAO, 2006, (5): 63 - 67