Learning Slimming SAR Ship Object Detector Through Network Pruning and Knowledge Distillation

Cited by: 82
Authors
Chen, Shiqi [1 ]
Zhan, Ronghui [1 ]
Wang, Wei [1 ]
Zhang, Jun [1 ]
Affiliation
[1] Natl Univ Def Technol, Coll Elect Sci, Natl Key Lab Sci & Technol Automat Target Recogni, Changsha 410073, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Attention mechanism; feature imitation; knowledge distillation (KD); lightweight synthetic aperture radar (SAR) ship detector; network pruning;
DOI
10.1109/JSTARS.2020.3041783
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
The deployment of deep convolutional neural networks (CNNs) for real-time SAR (synthetic aperture radar) ship detection is largely hindered by their huge computational cost. In this article, we propose a novel learning scheme for training a lightweight ship detector called Tiny YOLO-Lite, which simultaneously 1) reduces the model storage size; 2) decreases the floating-point operations (FLOPs); and 3) maintains high accuracy at faster speed. This is achieved by a self-designed backbone structure and network pruning, which enforces channel-level sparsity in the backbone network and yields a compact model. In addition, knowledge distillation is applied to compensate for the performance decline caused by network pruning. Here, the small student model is trained to mimic the cumbersome teacher's output to achieve improved generalization. Rather than applying vanilla full-feature imitation, we redefine the distilled knowledge as the inter-relationship between different levels of feature maps and then transfer it from the large network to the smaller one. Since the detector should focus on the salient regions containing ships while background interference is overwhelming, a novel attention mechanism is designed and attached to the distilled feature for enhanced representation. Finally, extensive experiments are conducted on SSDD, HRSID, and two large-scene SAR images to verify the effectiveness of the slimmer SAR ship object detector in comparison with other CNN-based algorithms. The detection results demonstrate that the proposed detector achieves a lighter architecture with a 2.8-MB model size, more efficient inference (>200 fps) with low computation cost, and more accurate prediction with the knowledge transfer strategy.
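The two core ideas in the abstract can be sketched in a few lines: channel-level sparsity pruning typically ranks channels by the magnitude of their batch-norm scale factors (network-slimming style) and drops the weakest ones, while relation-based distillation matches the similarity structure between feature levels of teacher and student rather than the raw feature maps. The sketch below is illustrative only, not the authors' implementation; all function names are hypothetical, and it assumes the per-level feature vectors have already been projected to a common dimension.

```python
import numpy as np

def channel_mask_from_bn(gamma, keep_ratio=0.5):
    """Network-slimming-style pruning: channels with small BN scale
    factors |gamma| contribute little and are pruned away."""
    n_keep = max(1, int(round(len(gamma) * keep_ratio)))
    order = np.argsort(-np.abs(gamma))          # channels by descending |gamma|
    mask = np.zeros(len(gamma), dtype=bool)
    mask[order[:n_keep]] = True
    return mask

def relation_matrix(level_vecs):
    """Cosine-similarity matrix between pooled feature vectors of the
    different pyramid levels (rows = levels, assumed common dimension)."""
    V = np.asarray(level_vecs, dtype=float)
    V = V / np.maximum(np.linalg.norm(V, axis=1, keepdims=True), 1e-12)
    return V @ V.T

def relation_kd_loss(student_rel, teacher_rel):
    """Student mimics the teacher's inter-level relation structure,
    not the full feature maps."""
    return float(np.mean((student_rel - teacher_rel) ** 2))
```

Pruning with `keep_ratio=0.5` halves the backbone's channels (and roughly its FLOPs), after which minimizing `relation_kd_loss` against a frozen teacher recovers accuracy lost to pruning.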
Pages: 1267 - 1282
Number of pages: 16
Related Papers
50 records
  • [31] NEURAL NETWORK PRUNING THROUGH CONSTRAINED REINFORCEMENT LEARNING
    Malik, Shehryar
    Haider, Muhammad Umair
    Iqbal, Omer
    Taj, Murtaza
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 3027 - 3033
  • [32] AN EFFICIENT ALTERNATIVE TO NETWORK PRUNING THROUGH ENSEMBLE LEARNING
    Poellot, Martin
    Zhang, Rui
    Kaup, Andre
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 4027 - 4031
  • [33] Student Network Learning via Evolutionary Knowledge Distillation
    Zhang, Kangkai
    Zhang, Chunhui
    Li, Shikun
    Zeng, Dan
    Ge, Shiming
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (04) : 2251 - 2263
  • [34] A Lightweight Object Counting Network Based on Density Map Knowledge Distillation
    Shen, Zhilong
    Li, Guoquan
    Xia, Ruiyang
    Meng, Hongying
    Huang, Zhengwen
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2025, 35 (02) : 1492 - 1505
  • [35] Feature Pruning and Recovery Learning with Knowledge Distillation for Occluded Person Re-Identification
    Hou, Mengyu
    Gan, Wenjun
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2024, PT VIII, 2025, 15038 : 339 - 353
  • [36] Knowledge Distillation based Compact Model Learning Method for Object Detection
    Ko, Jong Gook
    Yoo, Wonyoung
    11TH INTERNATIONAL CONFERENCE ON ICT CONVERGENCE: DATA, NETWORK, AND AI IN THE AGE OF UNTACT (ICTC 2020), 2020, : 1276 - 1278
  • [37] Scalability of knowledge distillation in incremental deep learning for fast object detection
    Yuwono, Elizabeth Irenne
    Tjondonegoro, Dian
    Sorwar, Golam
    Alaei, Alireza
    APPLIED SOFT COMPUTING, 2022, 129
  • [38] LABEL AUGMENTATION NETWORK BASED ON SELF-DISTILLATION FOR SAR SHIP DETECTION IN COMPLEX BACKGROUND
    Qin, Chuan
    Wang, Xueqian
    Li, Gang
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 5842 - 5845
  • [39] A Mixed-Scale Self-Distillation Network for Accurate Ship Detection in SAR Images
    Liu, Shuang
    Li, Dong
    Jiang, Renjie
    Liu, Qinghua
    Wan, Jun
    Yang, Xiaopeng
    Liu, Hehao
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2023, 16 : 9843 - 9857
  • [40] A Novel Multidimensional Domain Deep Learning Network for SAR Ship Detection
    Li, Dong
    Liang, Quanhuan
    Liu, Hongqing
    Liu, Qinghua
    Liu, Haijun
    Liao, Guisheng
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60