Example-based explainable AI and its application for remote sensing image classification

Cited by: 13
Authors
Ishikawa, Shin-nosuke [1 ,2 ]
Todo, Masato [2 ]
Taki, Masato [1 ]
Uchiyama, Yasunobu [1 ]
Matsunaga, Kazunari
Lin, Peihsuan [2 ]
Ogihara, Taiki [2 ]
Yasui, Masao [3 ]
Affiliations
[1] Rikkyo Univ, Grad Sch Artificial Intelligence & Sci, Tokyo 1718501, Japan
[2] Mamezou Co Ltd, Strateg Digital Business Unit, Tokyo 1630434, Japan
[3] Mamezou Co Ltd, Tokyo 1630434, Japan
Keywords
Machine learning; Deep learning; Explainable artificial intelligence; Remote sensing imagery;
DOI
10.1016/j.jag.2023.103215
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology];
Discipline classification codes
081102 ; 0816 ; 081602 ; 083002 ; 1404 ;
Abstract
We present an explainable artificial intelligence (XAI) method, "What I Know (WIK)", that provides additional information for verifying the reliability of a deep learning model by showing an instance from the training dataset that is similar to the input data being inferred, and we demonstrate it on a remote sensing image classification task. One expected role of XAI methods is to verify whether the inferences of a trained machine learning model are valid for an application, and the dataset used for training is as important a factor as the model architecture. Our data-centric approach helps determine whether the training dataset is sufficient for each inference by inspecting the selected example. If the selected example is similar to the input data, we can confirm that the model was not trained on a dataset whose feature distribution is far from that of the input. With this method, the criterion for selecting an example is not mere data similarity to the input but similarity in the context of the model's task. Using a remote sensing image dataset from the Sentinel-2 satellite, the concept was successfully demonstrated with reasonably selected examples. The method can be applied to various machine learning tasks, including classification and regression.
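The abstract's core idea, selecting the training instance most similar to the input in the context of the model's task, can be illustrated with a minimal sketch. The sketch below is an assumption-laden illustration, not the authors' exact WIK implementation: it assumes each image has already been mapped to a task-specific embedding (e.g., a penultimate-layer activation of the trained classifier) and picks the nearest training example by cosine similarity.

```python
import numpy as np

def most_similar_training_example(input_embedding, train_embeddings):
    """Return (index, similarity) of the training instance whose
    task-specific embedding is closest to the input's embedding
    under cosine similarity.

    input_embedding  : 1-D array, embedding of the query image.
    train_embeddings : 2-D array, one row per training image.
    """
    # Normalize rows so a plain dot product equals cosine similarity.
    a = input_embedding / np.linalg.norm(input_embedding)
    b = train_embeddings / np.linalg.norm(train_embeddings, axis=1, keepdims=True)
    similarities = b @ a
    idx = int(np.argmax(similarities))
    return idx, float(similarities[idx])

# Toy usage with 2-D embeddings (real embeddings would be high-dimensional).
train = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7], [-1.0, 0.0]])
query = np.array([0.6, 0.8])
idx, sim = most_similar_training_example(query, train)
```

Because the similarity is computed in the trained model's feature space rather than in raw pixel space, the retrieved example reflects what the model itself treats as similar for its task, which is the property the paper emphasizes.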
Pages: 6
Related papers
19 entries in total
[1]   Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI [J].
Barredo Arrieta, Alejandro ;
Diaz-Rodriguez, Natalia ;
Del Ser, Javier ;
Bennetot, Adrien ;
Tabik, Siham ;
Barbado, Alberto ;
Garcia, Salvador ;
Gil-Lopez, Sergio ;
Molina, Daniel ;
Benjamins, Richard ;
Chatila, Raja ;
Herrera, Francisco .
INFORMATION FUSION, 2020, 58 :82-115
[2]   Remote Sensing Image Scene Classification Meets Deep Learning: Challenges, Methods, Benchmarks, and Opportunities [J].
Cheng, Gong ;
Xie, Xingxing ;
Han, Junwei ;
Guo, Lei ;
Xia, Gui-Song .
IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2020, 13 :3735-3756
[3]   Explainable AI for earth observation: A review including societal and regulatory perspectives [J].
Gevaert, Caroline M. .
INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 2022, 112
[4]
Gomez P., 2021, arXiv, DOI 10.48550/arXiv.2103.10368
[5]
Hanawa K., 2021, arXiv, DOI 10.48550/arXiv.2006.04528
[6]
Helber P., 2019, arXiv, DOI 10.48550/arXiv.1709.00029
[7]  
Heusel M, 2017, ADV NEUR IN, V30
[8]   Automatic Detection of Occulted Hard X-Ray Flares Using Deep-Learning Methods [J].
Ishikawa, Shin-nosuke ;
Matsumura, Hideaki ;
Uchiyama, Yasunobu ;
Glesener, Lindsay .
SOLAR PHYSICS, 2021, 296 (02)
[9]   Evaluating explainable artificial intelligence methods for multi-label deep learning classification tasks in remote sensing [J].
Kakogeorgiou, Ioannis ;
Karantzalos, Konstantinos .
INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 2021, 103
[10]   Deep Learning Classification of Land Cover and Crop Types Using Remote Sensing Data [J].
Kussul, Nataliia ;
Lavreniuk, Mykola ;
Skakun, Sergii ;
Shelestov, Andrii .
IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2017, 14 (05) :778-782