Comparative analysis of the performance of selected machine learning algorithms depending on the size of the training sample

Cited by: 1
Authors
Kupidura, Przemyslaw [1 ]
Kepa, Agnieszka [1 ]
Krawczyk, Piotr [2 ]
Affiliations
[1] Warsaw Univ Technol, Fac Geodesy & Cartog, Pl Politechniki 1, PL-00661 Warsaw, Poland
[2] Orbitile Ltd, Potulkaly 6B-4, Warsaw, Poland
Keywords
efficiency; classification; machine learning; remote sensing; satellite imagery; training sample size; land cover; variables; LiDAR
DOI
10.2478/rgg-2024-0015
Chinese Library Classification (CLC)
TP7 [Remote Sensing Technology]
Subject classification codes
081102; 0816; 081602; 083002; 1404
Abstract
The article presents an analysis of the effectiveness of selected machine learning methods, namely Random Forest (RF), Extreme Gradient Boosting (XGB), and Support Vector Machine (SVM), in the classification of land use and land cover in satellite images. Several variants of each algorithm were tested, adopting different values of the parameters typical for each of them. Classification with each variant was repeated 20 times, using training samples of different sizes ranging from 100 to 200,000 pixels. The tests were conducted independently on three Sentinel-2 satellite images, identifying five basic land cover classes: built-up areas, soil, forest, water, and low vegetation. Standard metrics were used for the accuracy assessment: Cohen's kappa coefficient and overall accuracy for whole images, and the F1 score, precision, and recall for individual classes. The results obtained for the different images were consistent and clearly indicated that classification accuracy increases with the size of the training sample. They also showed that, among the tested algorithms, XGB is the most sensitive to training sample size, while SVM is the least sensitive, achieving relatively good results even with the smallest training samples. At the same time, while the differences between the tested variants of RF and XGB were slight, the effectiveness of SVM depended strongly on the gamma parameter: with excessively high values, the model tended to overfit and did not produce satisfactory results.
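As an illustration of the kind of experiment the abstract describes, the sketch below trains RF, an RBF-kernel SVM, and XGBoost on progressively larger training samples and scores each model with overall accuracy and Cohen's kappa on a fixed test set. This is not the authors' code: scikit-learn and xgboost are assumed as libraries, synthetic five-class "pixel" data stands in for the Sentinel-2 spectra, and all parameter values (including the SVM gamma) are illustrative only.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, cohen_kappa_score
from xgboost import XGBClassifier

# Synthetic stand-in for multispectral pixels: 10 "bands", 5 land-cover classes.
X, y = make_classification(n_samples=30_000, n_features=10, n_informative=6,
                           n_classes=5, random_state=0)
X_pool, X_test = X[:20_000], X[20_000:]
y_pool, y_test = y[:20_000], y[20_000:]

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(kernel="rbf", C=10, gamma=0.1),  # moderate gamma; very large values tend to overfit
    "XGB": XGBClassifier(n_estimators=200, max_depth=6, learning_rate=0.1),
}

rng = np.random.default_rng(0)
for n_train in (100, 1_000, 10_000):  # growing training-sample sizes (in pixels)
    idx = rng.choice(len(X_pool), size=n_train, replace=False)
    for name, model in models.items():
        model.fit(X_pool[idx], y_pool[idx])
        pred = model.predict(X_test)
        print(f"n={n_train:>6}  {name:<3}  "
              f"OA={accuracy_score(y_test, pred):.3f}  "
              f"kappa={cohen_kappa_score(y_test, pred):.3f}")

In such a setup the per-class F1 score, precision, and recall mentioned in the abstract could be obtained analogously from sklearn.metrics.classification_report on the same predictions.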
Pages: 53-69
Number of pages: 17
Related papers
48 records in total
[1] Allwright, S. (2023). Technical report.
[2] Belgiu, M., & Dragut, L. (2016). Random forest in remote sensing: A review of applications and future directions. ISPRS Journal of Photogrammetry and Remote Sensing, 114, 24-31.
[3] Bigdeli, A., Maghsoudi, A., & Ghezelbash, R. (2024). A comparative study of the XGBoost ensemble learning and multilayer perceptron in mineral prospectivity modeling: a case study of the Torud-Chahshirin belt, NE Iran. Earth Science Informatics, 17(1), 483-499.
[4] Boser, B. E. (1992). In Proceedings of the Fifth Annual ACM Workshop on Computational Learning Theory, p. 144. DOI 10.1145/130385.130401.
[5] Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5-32.
[7] Burkholder, A., Warner, T. A., Culp, M., & Landenberger, R. (2011). Seasonal trends in separability of leaf reflectance spectra for Ailanthus altissima and four other tree species. Photogrammetric Engineering and Remote Sensing, 77(8), 793-804.
[8] Chen, T., & Guestrin, C. (2016). XGBoost: A scalable tree boosting system. In KDD '16: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 785-794.
[9] Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37-46.
[10] Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine Learning, 20(3), 273-297.