Image-to-Height Domain Translation for Synthetic Aperture Sonar

Cited by: 2
Authors
Stewart, Dylan [1 ]
Kreulach, Austin [1 ]
Johnson, Shawn F. [2]
Zare, Alina [1 ]
Affiliations
[1] Univ Florida, Dept Elect & Comp Engn, Gainesville, FL 32611 USA
[2] Penn State Univ, Appl Res Lab, State Coll, PA 16801 USA
Source
IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING | 2023, Vol. 61
Keywords
Synthetic aperture sonar; Sonar; Estimation; Data models; Apertures; Sensors; Sea surface; Bathymetry; circular Synthetic Aperture Sonar (cSAS); conditional Generative Adversarial Network (cGAN); domain translation; Gaussian Markov random field (GMRF); pix2pix; SAS; UNet; RECONSTRUCTION;
DOI
10.1109/TGRS.2023.3236473
CLC Classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline Code
0708; 070902;
Abstract
Synthetic aperture sonar (SAS) intensity statistics depend on the sensing geometry at the time of capture, which makes estimating bathymetry from acoustic surveys challenging. While several methods have been proposed to estimate seabed relief from intensity, we present the first large-scale study that relies on deep learning models. In this work, we pose bathymetric estimation from SAS surveys as a domain translation problem: translating intensity to height. Since no dataset of coregistered seabed relief maps and sonar imagery previously existed from which to learn this domain translation, we produce the first large simulated dataset containing coregistered pairs of seabed relief and intensity maps, generated with two distinct sonar data simulation techniques. We apply four types of models of varying complexity to translate intensity imagery to seabed relief: a shape-from-shading (SFS) approach, a Gaussian Markov random field (GMRF) approach, a conditional Generative Adversarial Network (cGAN), and UNet architectures. Each model is applied to datasets containing sand-rippled, rocky, mixed, and flat sea bottoms. Methods are compared against the coregistered simulated datasets using L1 error. Additionally, we provide results on both simulated and real SAS imagery. Our results indicate that the proposed UNet architectures outperform the SFS, GMRF, and pix2pix cGAN models.
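The abstract frames bathymetric estimation as intensity-to-height image translation learned from coregistered pairs and scored with L1 error. Below is a minimal, illustrative PyTorch sketch of that setup, not the authors' implementation: the TinyUNet class, its channel counts and depth, and the random "coregistered" tensors are hypothetical placeholders standing in for the paper's UNet variants and simulated dataset.

# Minimal sketch (assumptions noted above): a UNet-style encoder-decoder mapping
# single-channel SAS intensity tiles to single-channel height maps, trained with
# an L1 objective matching the paper's comparison metric.
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: extract and downsample intensity features.
        self.enc1 = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        # Decoder: upsample back to the input resolution.
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        # Skip connection concatenates enc1 features with upsampled features.
        self.dec = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(16, 1, 1))  # one-channel height output

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(e1)
        d = self.up(e2)
        return self.dec(torch.cat([d, e1], dim=1))

if __name__ == "__main__":
    model = TinyUNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    l1 = nn.L1Loss()
    # Placeholder coregistered pair: intensity tiles and seabed-relief (height) tiles.
    intensity = torch.rand(4, 1, 64, 64)
    height = torch.rand(4, 1, 64, 64)
    for step in range(5):  # toy training loop
        optimizer.zero_grad()
        loss = l1(model(intensity), height)
        loss.backward()
        optimizer.step()
        print(f"step {step}: L1 = {loss.item():.4f}")

The same L1 computation, applied between a predicted and a reference relief map, is the evaluation measure the abstract describes for comparing the SFS, GMRF, cGAN, and UNet models.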
Pages: 13