Burned Area and Burn Severity Mapping With a Transformer-Based Change Detection Model

Cited by: 2
Authors
Han, Yuxin [1 ,2 ,3 ]
Zheng, Change [1 ,2 ,3 ]
Liu, Xiaodong [1 ,2 ,3 ]
Tian, Ye [1 ,2 ,3 ,4 ]
Dong, Zixun [1 ,2 ,3 ]
Affiliations
[1] Beijing Forestry Univ, Sch Technol, Beijing 100083, Peoples R China
[2] State Key Lab Efficient Prod Forest Resources, Beijing 100083, Peoples R China
[3] Natl Forestry & Grassland Adm Forestry Equipment &, Key Lab, Beijing 100083, Peoples R China
[4] Beijing Forestry Univ, Sch Ecol & Nat Conservat, Beijing 100083, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
Vegetation mapping; Forestry; Transformers; Feature extraction; Accuracy; Remote sensing; Indexes; Burned area; burn severity; change detection; deep learning (DL); SPECTRAL INDEXES; VEGETATION; RECOVERY; VERSION;
DOI
10.1109/JSTARS.2024.3435857
Chinese Library Classification
TM [Electrical Engineering Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Forest fires are significant disturbances to ecosystems, making accurate mapping of burned areas and assessment of burn severity essential. We first reconstruct a dataset from Landsat imagery whose labels follow a more flexible classification scheme, and we establish auxiliary environmental datasets for the fire-affected regions. Leveraging vegetation change between prefire and postfire imagery, we propose a transformer-based change detection model that effectively integrates remote sensing and environmental information. We further introduce a multilevel feature fusion mechanism to mitigate spatial resolution degradation in burn severity estimation. Experimental results show that the model's predictions closely match the evaluation dataset labels: for burned area segmentation, our method achieves the highest F1 score (0.897) and mIoU (0.781); for burn severity estimation, it also achieves the highest mIoU (0.851). Incorporating the auxiliary environmental features improves performance by nearly 30%, while the multilevel feature fusion mechanism reduces resolution degradation by 9.6%.
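For orientation, the sketch below illustrates the standard spectral-index machinery the abstract alludes to: the Normalized Burn Ratio (NBR) computed from Landsat NIR and SWIR2 bands, its prefire-minus-postfire difference (dNBR) as a change signal, and the mIoU metric quoted in the results. This is a minimal illustration, not the authors' transformer model; the severity breakpoints are assumed USGS/FIREMON-style values, whereas the paper derives its own, more flexible labels.

```python
import numpy as np

def nbr(nir, swir2, eps=1e-6):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    nir = np.asarray(nir, dtype=np.float64)
    swir2 = np.asarray(swir2, dtype=np.float64)
    return (nir - swir2) / (nir + swir2 + eps)

def dnbr(pre_nir, pre_swir2, post_nir, post_swir2):
    """Differenced NBR (prefire minus postfire); higher values
    indicate stronger vegetation loss, i.e., more severe burning."""
    return nbr(pre_nir, pre_swir2) - nbr(post_nir, post_swir2)

# Illustrative dNBR breakpoints in the spirit of the USGS/FIREMON
# scheme (an assumption here, not the paper's labeling method).
SEVERITY_EDGES = np.array([0.1, 0.27, 0.44, 0.66])
SEVERITY_NAMES = ["unburned", "low", "moderate-low", "moderate-high", "high"]

def severity_classes(dnbr_img):
    """Map a dNBR image to integer severity classes 0..4."""
    return np.digitize(dnbr_img, SEVERITY_EDGES)

def mean_iou(pred, label, n_classes):
    """Mean intersection-over-union (the mIoU metric in the abstract),
    averaged over classes present in either the prediction or the label."""
    ious = []
    for c in range(n_classes):
        inter = np.logical_and(pred == c, label == c).sum()
        union = np.logical_or(pred == c, label == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))
```

Under these assumptions, a dNBR of 0.3 falls in the moderate-low class, and mean_iou averages per-class IoU only over classes that actually occur, so absent classes do not dilute the score.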
Pages: 13866-13880
Number of pages: 15
Related Papers
50 records in total
  • [31] CityTransformer: A Transformer-Based Model for Contaminant Dispersion Prediction in a Realistic Urban Area
    Asahi, Yuuichi
    Onodera, Naoyuki
    Hasegawa, Yuta
    Shimokawabe, Takashi
    Shiba, Hayato
    Idomura, Yasuhiro
    BOUNDARY-LAYER METEOROLOGY, 2023, 186 (03) : 659 - 692
  • [32] A New Model for Transfer Learning-Based Mapping of Burn Severity
    Zheng, Zhong
    Wang, Jinfei
    Shan, Bo
    He, Yongjun
    Liao, Chunhua
    Gao, Yanghua
    Yang, Shiqi
    REMOTE SENSING, 2020, 12 (04)
  • [33] A Transformer-Based Framework for Tiny Object Detection
    Liao, Yi-Kai
    Lin, Gong-Si
    Yeh, Mei-Chen
    2023 ASIA PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE, APSIPA ASC, 2023, : 373 - 377
  • [34] Transformer-based models for multimodal irony detection
    Tomás, D.
    Ortega-Bueno, R.
    Zhang, G.
    Rosso, P.
    Schifanella, R.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 (6) : 7399 - 7410
  • [35] A Generalized Transformer-Based Pulse Detection Algorithm
    Dematties, Dario
    Wen, Chenyu
    Zhang, Shi-Li
    ACS SENSORS, 2022, 7 (09) : 2710 - 2720
  • [36] Survey of Transformer-Based Object Detection Algorithms
    Li, Jian
    Du, Jianqiang
    Zhu, Yanchen
    Guo, Yongkun
    Computer Engineering and Applications, 2023, 59 (10) : 48 - 64
  • [37] Transformer-based mass detection in digital mammograms
    Betancourt Tarifa, A. S.
    Marrocco, C.
    Molinara, M.
    Tortorella, F.
    Bria, A.
    Journal of Ambient Intelligence and Humanized Computing, 2023, 14 (03) : 2723 - 2737
  • [38] BlinkLinMulT: Transformer-Based Eye Blink Detection
    Fodor, Adam
    Fenech, Kristian
    Lorincz, Andras
    JOURNAL OF IMAGING, 2023, 9 (10)
  • [39] Transformer-Based Intrusion Detection for IoT Networks
    Akuthota, Uday Chandra
    Bhargava, Lava
    IEEE INTERNET OF THINGS JOURNAL, 2025, 12 (05) : 6062 - 6067
  • [40] A transformer-based approach to irony and sarcasm detection
    Potamias, Rolandos Alexandros
    Siolas, Georgios
    Stafylopatis, Andreas-Georgios
    Neural Computing and Applications, 2020, 32 : 17309 - 17320