AFENet: Attention-guided feature enhancement network and a benchmark for low-altitude UAV sewage outfall detection

Cited by: 2
Authors
Huang, Qingsong [1 ,2 ]
Fan, Junqing [1 ]
Xu, Haoran [1 ]
Han, Wei [1 ]
Huang, Xiaohui [1 ]
Chen, Yunliang [1 ]
Affiliations
[1] China Univ Geosci, Sch Comp Sci, Wuhan 430070, Hubei, Peoples R China
[2] Int Res Ctr Big Data Sustainable Dev Goals, Beijing 100094, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
UAV; River inspection; Sewage outfalls; Attention mechanism; Object detection; PLUMES;
DOI
10.1016/j.array.2024.100343
CLC Number
TP301 [Theory, Methods];
Subject Classification Code
081202 ;
Abstract
Inspecting sewage outfalls into rivers is significant for the precise management of the ecological environment, because outfalls are the last gate for pollutants entering a river. Unmanned Aerial Vehicles (UAVs) are maneuverable and capture high-resolution images, and have become an important means of inspecting sewage outfalls. Although UAVs are widely used in daily sewage outfall inspections, interpretation still relies on manual work, and a corresponding low-altitude sewage outfall image dataset is lacking. Meanwhile, because sewage outfalls are sparsely distributed in space, problems such as scarce labeled samples, complex background types, and weak objects are also prominent. To promote the inspection of sewage outfalls, this paper proposes a low-altitude sewage outfall object detection dataset, namely UAV-SOD, and an attention-guided feature enhancement network, namely AFENet. The UAV-SOD dataset features high resolution, complex backgrounds, and diverse objects. Some of the outfall objects are limited by multi-scale variation, uniform color, and weak feature responses, leading to low detection accuracy. To localize these objects effectively, AFENet first uses a global context block (GCB) to jointly explore valuable global and local information, and then a region of interest (RoI) attention module (RAM) to explore the relationships between RoI features. Experimental results show that the proposed method improves detection performance on the proposed UAV-SOD dataset over representative state-of-the-art two-stage object detection methods.
Pages: 14