Image-to-Height Domain Translation for Synthetic Aperture Sonar

Cited by: 2
Authors
Stewart, Dylan [1 ]
Kreulach, Austin [1 ]
Johnson, Shawn F. F. [2 ]
Zare, Alina [1 ]
Affiliations
[1] Univ Florida, Dept Elect & Comp Engn, Gainesville, FL 32611 USA
[2] Penn State Univ, Appl Res Lab, State Coll, PA 16801 USA
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2023, Vol. 61
Keywords
Synthetic aperture sonar; Sonar; Estimation; Data models; Apertures; Sensors; Sea surface; Bathymetry; circular Synthetic Aperture Sonar (cSAS); conditional Generative Adversarial Network (cGAN); domain translation; Gaussian Markov random field (GMRF); pix2pix; SAS; UNet; RECONSTRUCTION
DOI
10.1109/TGRS.2023.3236473
CLC Classification Number
P3 [Geophysics]; P59 [Geochemistry]
Subject Classification Number
0708; 070902
Abstract
Synthetic aperture sonar (SAS) intensity statistics depend on the sensing geometry at the time of capture, and estimating bathymetry from acoustic surveys is challenging. While several methods have been proposed to estimate seabed relief from intensity, we present the first large-scale study that applies deep learning models to this task. In this work, we pose bathymetric estimation from SAS surveys as a domain translation problem: translating intensity to height. Since no dataset of coregistered seabed relief maps and sonar imagery previously existed from which to learn this translation, we produce the first large simulated dataset containing coregistered pairs of seabed relief and intensity maps, generated with two distinct sonar data simulation techniques. We apply four types of models of varying complexity to translate intensity imagery to seabed relief: a shape-from-shading (SFS) approach, a Gaussian Markov random field (GMRF) approach, a conditional Generative Adversarial Network (cGAN), and UNet architectures. Each model is applied to datasets containing sand-ripple, rocky, mixed, and flat sea bottoms. Methods are compared against the coregistered simulated datasets using L1 error, and results are additionally provided on simulated and real SAS imagery. Our results indicate that the proposed UNet architectures outperform the SFS, GMRF, and pix2pix cGAN models.
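The L1 error used above to compare methods is the mean absolute difference between a predicted height map and its coregistered ground-truth relief. A minimal sketch follows; the function name `l1_error` and the toy 2x2 height maps are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def l1_error(pred, truth):
    """Mean absolute (L1) error between a predicted and a reference height map."""
    pred = np.asarray(pred, dtype=float)
    truth = np.asarray(truth, dtype=float)
    return float(np.mean(np.abs(pred - truth)))

# Toy 2x2 "height maps" (hypothetical values, for illustration only).
pred = [[0.0, 1.0], [2.0, 3.0]]
truth = [[1.0, 1.0], [2.0, 1.0]]
print(l1_error(pred, truth))  # -> 0.75
```

Because the metric is computed per coregistered pixel pair, it rewards models that recover absolute height values, not just relative relief structure.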
Pages: 13
Related Papers
50 records in total
[31]   Relative height estimation by cross-correlating ground-range synthetic aperture sonar images [J].
Saebo, Torstein Olsmo ;
Hansen, Roy Edgar ;
Hanssen, Alfred .
IEEE JOURNAL OF OCEANIC ENGINEERING, 2007, 32 (04) :971-982
[32]   In Situ Array Calibration for Synthetic Aperture Sonar [J].
Dillon, Jeremy ;
Steele, Shannon-Morgan .
GLOBAL OCEANS 2020: SINGAPORE - U.S. GULF COAST, 2020,
[33]   Modified synthetic aperture algorithm for sonar systems [J].
Sawa, T. ;
Aoki, T. ;
Yoshida, H. ;
Hyakudome, T. ;
Ishibashi, S. ;
Matsubara, S. ;
Wright, S. .
INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2010, 24 (02) :142-148
[34]   Squint and forward looking Synthetic Aperture Sonar [J].
Caprais, P ;
Guyonic, S .
OCEANS '97 MTS/IEEE CONFERENCE PROCEEDINGS, VOLS 1 AND 2, 1997, :809-814
[35]   Motion compensation on synthetic aperture sonar images [J].
Heremans, R. ;
Acheroy, M. ;
Dupont, Y. .
IMAGE AND SIGNAL PROCESSING FOR REMOTE SENSING XII, 2006, 6365
[36]   Seabed Segmentation in Synthetic Aperture Sonar Images [J].
Cobb, J. Tory ;
Principe, Jose .
DETECTION AND SENSING OF MINES, EXPLOSIVE OBJECTS, AND OBSCURED TARGETS XVI, 2011, 8017
[37]   Classification of objects in synthetic Aperture Sonar images [J].
Marchand, Bradley ;
Saito, Naoki ;
Xiao, Hong .
2007 IEEE/SP 14TH WORKSHOP ON STATISTICAL SIGNAL PROCESSING, VOLS 1 AND 2, 2007, :433-437
[38]   Imaging algorithms for a strip-map synthetic aperture sonar: Minimizing the effects of aperture errors and aperture undersampling [J].
Gough, PT ;
Hawkins, DW .
IEEE JOURNAL OF OCEANIC ENGINEERING, 1997, 22 (01) :27-39
[39]   Design Considerations and Operational Advantages of a Modular AUV with Synthetic Aperture Sonar [J].
Taylor, Mikell ;
Wilby, Andy .
OCEANS 2011, 2011,
[40]   Interested Small Target Detection Method Based on Improved SSD for Synthetic Aperture Sonar Image [J].
Li B.-Q. ;
Huang H.-N. ;
Liu J.-Y. ;
Liu Z.-J. ;
Wei L.-Z. .
Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2024, 52 (03) :762-771