Dual-Awareness Attention for Few-Shot Object Detection

Cited by: 61
Authors
Chen, Tung-I [1]
Liu, Yueh-Cheng [1]
Su, Hung-Ting [1]
Chang, Yu-Cheng [1]
Lin, Yu-Hsiang [1]
Yeh, Jia-Fong [1]
Chen, Wen-Chin [1]
Hsu, Winston H. [1,2]
Affiliations
[1] Natl Taiwan Univ, Taipei 106, Taiwan
[2] Mobile Drive Technol, Taipei 236, Taiwan
Keywords
Feature extraction; Object detection; Detectors; Correlation; Task analysis; Power capacitors; Adaptation models; Deep learning; object detection; visual attention; few-shot object detection; NETWORKS;
DOI
10.1109/TMM.2021.3125195
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
While recent progress has significantly boosted few-shot classification (FSC) performance, few-shot object detection (FSOD) remains challenging for modern learning systems. Existing FSOD systems follow FSC approaches and ignore critical issues such as spatial variability and uncertain representations, which leads to poor performance. Motivated by this observation, we propose a novel Dual-Awareness Attention (DAnA) mechanism that enables networks to adaptively interpret the given support images. DAnA transforms support images into query-position-aware (QPA) features, guiding the detection network precisely by assigning customized support information to each local region of the query. Moreover, the proposed DAnA component is flexible and can be adapted to multiple existing object detection frameworks. With DAnA, conventional detectors such as Faster R-CNN and RetinaNet, which are not explicitly designed for few-shot learning, reach state-of-the-art FSOD performance. Compared with previous methods, our model improves performance by 47% (+6.9 AP), demonstrating strong results under various evaluation settings.
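To make the query-position-aware idea concrete, the sketch below shows one way such cross-image attention could be implemented: every spatial location of the query feature map attends over all support locations and receives its own support summary. This is a minimal illustration, not the authors' released implementation; the module name QPAAttention, the single-head dot-product formulation, and the residual fusion are assumptions made for exposition, and the full DAnA mechanism described in the paper is more elaborate.

```python
import torch
import torch.nn as nn

class QPAAttention(nn.Module):
    """Hypothetical sketch of query-position-aware (QPA) attention.

    Each query position attends over all support positions, so the
    support image is summarized differently for every local region
    of the query (a simplification of the paper's DAnA component).
    """

    def __init__(self, channels: int):
        super().__init__()
        self.q_proj = nn.Conv2d(channels, channels, kernel_size=1)
        self.k_proj = nn.Conv2d(channels, channels, kernel_size=1)
        self.v_proj = nn.Conv2d(channels, channels, kernel_size=1)
        self.scale = channels ** -0.5

    def forward(self, query_feat: torch.Tensor,
                support_feat: torch.Tensor) -> torch.Tensor:
        # query_feat:   (B, C, Hq, Wq) backbone features of the query image
        # support_feat: (B, C, Hs, Ws) backbone features of a support image
        B, C, Hq, Wq = query_feat.shape
        q = self.q_proj(query_feat).flatten(2).transpose(1, 2)    # (B, Hq*Wq, C)
        k = self.k_proj(support_feat).flatten(2)                  # (B, C, Hs*Ws)
        v = self.v_proj(support_feat).flatten(2).transpose(1, 2)  # (B, Hs*Ws, C)
        # Affinity between every query position and every support position.
        attn = torch.softmax(q @ k * self.scale, dim=-1)          # (B, Hq*Wq, Hs*Ws)
        # Customized support summary for each query position.
        qpa = (attn @ v).transpose(1, 2).reshape(B, C, Hq, Wq)
        # Fuse the QPA features back into the query representation;
        # a detector head would consume the result downstream.
        return query_feat + qpa

# Toy usage (shapes are illustrative assumptions):
# qpa = QPAAttention(channels=256)
# out = qpa(torch.randn(2, 256, 38, 50), torch.randn(2, 256, 20, 20))
# out.shape -> torch.Size([2, 256, 38, 50])
```

In a detector such as Faster R-CNN or RetinaNet, the fused feature map would replace the plain backbone output before the region proposal or classification heads, which is consistent with the abstract's claim that DAnA can be attached to multiple existing frameworks.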
Pages: 291-301
Page count: 11
Related Papers
50 items in total
  • [21] Decoupled Metric Network for Single-Stage Few-Shot Object Detection
    Lu, Yue
    Chen, Xingyu
    Wu, Zhengxing
    Yu, Junzhi
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (01) : 514 - 525
  • [22] Few-Shot Incremental Object Detection in Aerial Imagery via Dual-Frequency Prompt
    Lu, Xiaonan
    Diao, Wenhui
    Li, Junxi
    Zhang, Yidan
    Wang, Peijin
    Sun, Xian
    Fu, Kun
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 17
  • [23] FSNA: Few-Shot Object Detection via Neighborhood Information Adaption and All Attention
    Zhu, Jinxiang
    Wang, Qi
    Dong, Xinyu
    Ruan, Weijian
    Chen, Haolin
    Lei, Liang
    Hao, Gefei
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (08) : 7121 - 7134
  • [24] Few-Shot Object Detection in Remote Sensing Images With Multiscale Spatial Selective Attention
    Yu, Yingnan
    Chen, Si-Bao
    Huang, Li-Li
    Tang, Jin
    Luo, Bin
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2024, 21
  • [25] A Few-Shot Object Detection Method for Endangered Species
    Yan, Hongmei
    Ruan, Xiaoman
    Zhu, Daixian
    Kong, Haoran
    Liu, Peixuan
    APPLIED SCIENCES-BASEL, 2024, 14 (11)
  • [26] Few-Shot Object Detection with Proposal Balance Refinement
    Kim, Sueyeon
    Nam, Woo-Jeoung
    Lee, Seong-Whan
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022: 4700 - 4707
  • [27] Few-Shot Object Detection via Understanding Convolution and Attention
    Tong, Jiaxing
    Chen, Tao
    Wang, Qiong
    Yao, Yazhou
    PATTERN RECOGNITION AND COMPUTER VISION, PT I, PRCV 2022, 2022, 13534 : 674 - 687
  • [28] Few-Shot Object Detection on Remote Sensing Images
    Li, Xiang
    Deng, Jingyu
    Fang, Yi
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2022, 60
  • [29] Balanced Orthogonal Subspace Separation Detector for Few-Shot Object Detection in Aerial Imagery
    Jiang, Hongxiang
    Wang, Qixiong
    Feng, Jiaqi
    Zhang, Guangyun
    Yin, Jihao
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62
  • [30] GFENet: Generalization Feature Extraction Network for Few-Shot Object Detection
    Ke, Xiao
    Chen, Qiuqin
    Liu, Hao
    Guo, Wenzhong
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2024, 34 (12) : 12741 - 12755