DBSwin: Transformer based dual branch network for single image deraining

Cited by: 0
Authors
Tan, Fuxiang [1 ,2 ,3 ]
Qian, Yurong [1 ,2 ,3 ]
Kong, Yuting [1 ,2 ,3 ]
Zhang, Hao [1 ,2 ,3 ]
Zhou, Daxin [1 ,2 ,3 ]
Fan, Yingying [1 ,2 ,3 ]
Chen, Long [1 ,2 ,3 ]
Xiao, Zhengqing [1 ,2 ,3 ]
Affiliations
[1] Xinjiang Univ, Sch Software, Urumqi, Peoples R China
[2] Xinjiang Univ, Key Lab Software Engn, Urumqi, Peoples R China
[3] Key Lab Signal Detect & Proc Xinjiang Uygur Auton, Urumqi, Peoples R China
Funding
US National Science Foundation;
Keywords
Deraining; transformer; deep learning; image processing; channel attention;
DOI
10.3233/JIFS-220055
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Rain streaks severely affect the perception of an image's content and structure, so high-performance deraining algorithms are needed to eliminate the effects of various rain streaks for high-level computer vision tasks. Although existing deraining methods have made considerable progress, single image deraining remains challenging. In this paper, we first point out that existing Transformers lack sufficient ability to capture channel attention, which restricts their deraining performance. To improve deraining performance, we propose a dual-branch deraining network based on the Transformer. One branch uses dense connections to link Transformer modules that embed composite channel attention; this branch captures channel attention more finely to learn representations of rain streak features. The other branch first obtains features at different scales by gradually expanding the receptive field, then uses these features to compute attention over regional features, and finally uses this attention to guide the model to focus on areas of high rain streak density and large scale. By fusing the two branches, the model both captures channel attention more finely and focuses on regions of high rain streak density and large scale. Extensive experimental results on synthetic and real datasets demonstrate that the proposed method outperforms most advanced deraining methods.
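For intuition, the kind of channel attention the abstract emphasizes can be sketched as a squeeze-and-excitation style per-channel gate. This is an assumption for illustration only, not the paper's composite channel attention (whose details are not given in this record); the function and weight names (`channel_attention`, `w1`, `w2`) are hypothetical:

```python
import numpy as np

def channel_attention(x, w1, w2):
    """Gate each channel of a (C, H, W) feature map by a learned scalar weight.

    w1: (C//r, C) bottleneck weights, w2: (C, C//r) expansion weights
    (r is the reduction ratio). A minimal sketch, not the paper's module.
    """
    # "Squeeze": global average pool over the spatial dimensions -> (C,)
    pooled = x.mean(axis=(1, 2))
    # "Excitation": two-layer bottleneck with ReLU, then a sigmoid gate per channel
    hidden = np.maximum(w1 @ pooled, 0.0)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))          # (C,), values in (0, 1)
    # Reweight each channel of the feature map by its gate
    return x * gate[:, None, None]

# With zero weights the gate is sigmoid(0) = 0.5 for every channel,
# so every feature value is simply halved.
out = channel_attention(np.ones((4, 2, 2)), np.zeros((2, 4)), np.zeros((4, 2)))
```

In a dual-branch design like the one described, such a gate would sit inside the Transformer modules of the densely connected branch, while the other branch supplies spatial (regional) attention; the two outputs are then fused.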
Pages: 5109-5123 (15 pages)
Related papers (50 records)
  • [1] Single Image Deraining Using Dual Branch Network Based on Attention Mechanism for IoT
    Wang, Di
    Wei, Bingcai
    Zhang, Liye
    CMES-COMPUTER MODELING IN ENGINEERING & SCIENCES, 2023, 137 (02): : 1989 - 2000
  • [2] Channel Pyramidal Transformer Network for Single Image Deraining
    Xu, Yifei
    Long, Zourong
    Tang, Bin
    Lei, Siyue
    IEEE SIGNAL PROCESSING LETTERS, 2023, 30 : 1757 - 1761
  • [3] A Single Image Deraining Algorithm Based on Swin Transformer
    Gao, T.
    Wen, Y.
    Chen, T.
    Zhang, J.
    Shanghai Jiaotong Daxue Xuebao/Journal of Shanghai Jiaotong University, 2023, 57 (05): : 613 - 623
  • [4] Multi-scale Channel Transformer Network for Single Image Deraining
    Namba, Yuto
    Han, Xian-Hua
    PROCEEDINGS OF THE 4TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA IN ASIA, MMASIA 2022, 2022,
  • [5] Rain removal method for single image of dual-branch joint network based on sparse transformer
    Qin, Fangfang
    Jia, Zongpu
    Pang, Xiaoyan
    Zhao, Shan
    COMPLEX & INTELLIGENT SYSTEMS, 2025, 11 (01)
  • [6] DPCN: Dual Path Convolutional Network for Single Image Deraining
    Zhang, Wenhao
    Zhou, Yue
    Duan, Shukai
    Hu, Xiaofang
    PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT III, 2022, 13631 : 310 - 324
  • [7] Alternating attention Transformer for single image deraining
    Yang, Dawei
    He, Xin
    Zhang, Ruiheng
    DIGITAL SIGNAL PROCESSING, 2023, 141
  • [8] RainFormer: a pyramid transformer for single image deraining
    Yang, Hao
    Zhou, Dongming
    Cao, Jinde
    Zhao, Qian
    Li, Miao
    JOURNAL OF SUPERCOMPUTING, 2023, 79 (06): : 6115 - 6140
  • [9] MLTDNet: an efficient multi-level transformer network for single image deraining
    Gao, Feng
    Mu, Xiangyu
    Ouyang, Chao
    Yang, Kai
    Ji, Shengchang
    Guo, Jie
    Wei, Haokun
    Wang, Nan
    Ma, Lei
    Yang, Biao
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (16): : 14013 - 14027