Generating real-world-like labelled synthetic datasets for construction site applications

Cited by: 12
Authors
Barrera-Animas, Ari Yair [1 ]
Delgado, Juan Manuel Davila [1 ]
Affiliations
[1] University of the West of England, Bristol Business School, Big Data Enterprise & Artificial Intelligence Lab, Bristol, England
Funding
Innovate UK project;
Keywords
Object detection; Synthetic dataset; Construction field; Auto-annotation;
DOI
10.1016/j.autcon.2023.104850
CLC number
TU [Building Science];
Subject classification code
0813;
Abstract
Treating synthetic image generation and automatic labelling as two separate processes remains one of the main limitations in the automatic generation of large, real-world-like synthetic datasets. To overcome this drawback, a methodology that performs both tasks simultaneously and automatically is proposed. To resemble real-world scenarios, a diverse set of rendering configurations of illumination, locations, and sizes is presented. For testing, three synthetic datasets (S, M, and SM), oriented to the construction field, were generated. The Faster R-CNN, RetinaNet, and YOLOv4 detection algorithms were used to independently evaluate the datasets using the COCO evaluation metrics and the Pascal VOC mean average precision (mAP) metric. Results show that, in general, the S dataset performed 1.2% better on the evaluation metrics, while the SM dataset produced better training and validation loss curves for each detector, highlighting the combined use of images with single and multiple objects as a better generalisation approach.
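The core idea summarised above is that generation and labelling happen in a single pass: because the generator chooses where and how large each object appears, the bounding-box label is a by-product of generation rather than a separate annotation step. As a rough, hypothetical illustration of that idea only (the paper's actual pipeline uses 3D rendering configurations, not 2D compositing; the compose_and_label helper, file names, and category ids below are invented for this sketch), the following Python code pastes object cut-outs onto a background image and emits COCO-style boxes from the same placement parameters used to create the image:

import json
import random

from PIL import Image  # pip install pillow


def compose_and_label(background, cutouts, n_objects, image_id):
    """Paste object cut-outs onto a background and record their
    bounding boxes in the same pass, so no separate labelling step
    is needed. `cutouts` is a list of (category_id, RGBA Image)
    pairs; cut-outs are assumed smaller than the background."""
    canvas = background.copy()
    annotations = []
    for _ in range(n_objects):
        category_id, cutout = random.choice(cutouts)
        # Random size: rescale the cut-out to mimic objects at
        # different distances from the camera.
        scale = random.uniform(0.3, 1.0)
        w = max(1, int(cutout.width * scale))
        h = max(1, int(cutout.height * scale))
        obj = cutout.resize((w, h))
        # Random location fully inside the canvas.
        x = random.randint(0, canvas.width - w)
        y = random.randint(0, canvas.height - h)
        canvas.paste(obj, (x, y), obj)  # alpha channel acts as paste mask
        # The placement parameters ARE the label: emit a COCO-style
        # [x, y, width, height] box with no extra annotation step.
        annotations.append({
            "image_id": image_id,
            "category_id": category_id,
            "bbox": [x, y, w, h],
            "area": w * h,
            "iscrowd": 0,
        })
    return canvas, annotations


# Hypothetical usage with invented file names and a single category.
background = Image.open("site_background.jpg").convert("RGBA")
cutouts = [(1, Image.open("excavator.png").convert("RGBA"))]
image, anns = compose_and_label(background, cutouts, n_objects=3, image_id=0)
image.convert("RGB").save("synthetic_000.jpg")
with open("synthetic_000.json", "w") as f:
    json.dump(anns, f)

Since the box coordinates come directly from the sampled placement parameters, the resulting annotations are exact by construction, and the JSON can be scored with standard COCO tooling such as pycocotools, in line with the COCO evaluation metrics used in the paper.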
Pages: 18
Related papers
2 records
  • [1] Investigating the optimisation of real-world and synthetic object detection training datasets through the consideration of environmental and simulation factors
    Newman, Callum; Petzing, Jon; Goh, Yee Mey; Justham, Laura
    INTELLIGENT SYSTEMS WITH APPLICATIONS, 2022, 14
  • [2] Systematic Evaluation of Two Classical Receptor Models in Source Apportionment of Soil Heavy Metal(loid) Pollution Using Synthetic and Real-World Datasets
    Hu, Yuanan; Yang, Sen; Cheng, Hefa; Tao, Shu
    ENVIRONMENTAL SCIENCE & TECHNOLOGY, 2022, 56 (24): 17604-17614