Reconstructing as-built beam bridge geometry from construction drawings using deep learning-based symbol pose estimation

Cited by: 1
Authors
Faltin, Benedikt [1 ]
Schoenfelder, Phillip [1 ]
Gann, Damaris [1 ]
Koenig, Markus [1 ]
Affiliations
[1] Ruhr Univ Bochum, Dept Civil & Environm Engn, Univ Str 150, D-44801 Bochum, Germany
Keywords
Building information modeling; As-built model generation; Symbol detection; Bridge construction drawings; Deep learning; Computer vision
DOI
10.1016/j.aei.2024.102808
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Efficient maintenance planning and streamlined inspection are essential to prevent catastrophic structural failures of bridges. Digital Bridge Management Systems (BMS) have the potential to streamline these tasks; however, their effectiveness relies heavily on the availability of accurate digital bridge models, which are currently challenging and costly to create, limiting the widespread adoption of BMS. This study addresses this issue by proposing a computer vision-based process for generating bridge superstructure models from pixel-based construction drawings. Implementing parts of the proposed process, we introduce an automatic pipeline that spatially organizes drawing views using a deep learning-based symbol pose estimation approach built on Keypoint R-CNN. By extending the keypoint-based detection approach to simultaneously handle multiple object classes with varying numbers of keypoints, a single instance of Keypoint R-CNN can be trained for all identified symbols. To assess the method's performance and improve the comparability of trained models, we conducted an empirical analysis to determine suitable evaluation parameters for the symbol pose estimation approach. Our findings demonstrate promising steps towards efficient bridge modeling, ultimately facilitating maintenance planning and management.
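The abstract describes extending Keypoint R-CNN so that one model serves symbol classes with differing keypoint counts. A common way to achieve this (a minimal sketch under assumed conventions, not the authors' published implementation; the symbol class names and keypoint counts below are hypothetical) is to pad every class's keypoint list up to a global maximum and mark the padded slots invisible, mirroring the COCO-style (x, y, visibility) triple that keypoint detectors such as torchvision's Keypoint R-CNN consume:

```python
# Keypoint R-CNN expects every instance to carry the same number of
# keypoints. To train one model over symbol classes with different
# keypoint counts, pad each annotation to a shared maximum and flag
# the padded slots as invisible (visibility = 0), so they contribute
# nothing to the keypoint loss.

KEYPOINTS_PER_CLASS = {          # hypothetical drawing-symbol classes
    "support_fixed": 1,
    "support_movable": 2,
    "cross_section_marker": 4,
}

MAX_KEYPOINTS = max(KEYPOINTS_PER_CLASS.values())

def pad_keypoints(cls, keypoints):
    """Turn a class-specific list of (x, y) points into a fixed-length
    list of (x, y, visibility) triples of length MAX_KEYPOINTS."""
    expected = KEYPOINTS_PER_CLASS[cls]
    if len(keypoints) != expected:
        raise ValueError(f"{cls} expects {expected} keypoints, got {len(keypoints)}")
    padded = [(x, y, 1) for x, y in keypoints]          # real, visible points
    padded += [(0.0, 0.0, 0)] * (MAX_KEYPOINTS - expected)  # invisible padding
    return padded

anchor = pad_keypoints("support_fixed", [(120.0, 45.0)])
# anchor == [(120.0, 45.0, 1), (0.0, 0.0, 0), (0.0, 0.0, 0), (0.0, 0.0, 0)]
```

With all classes padded to the same length, the detector's keypoint head can keep a single fixed output size while the visibility flags tell the loss which slots are meaningful for each class.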
Pages: 14
Related Papers
50 records in total
  • [1] A deep learning-based approach to facilitate the as-built state recognition of indoor construction works
    Ekanayake, Biyanka
    Fini, Alireza Ahmadian Fard
    Wong, Johnny Kwok Wai
    Smith, Peter
    CONSTRUCTION INNOVATION-ENGLAND, 2024, 24 (04): 933-949
  • [2] Review of Deep Learning-Based Human Pose Estimation
    Lu Jian
    Yang Tengfei
    Zhao Bo
    Wang Hangying
    Luo Maoxin
    Zhou Yanran
    Li Zhe
    LASER & OPTOELECTRONICS PROGRESS, 2021, 58 (24)
  • [3] Deep Learning-based Human Pose Estimation: A Survey
    Zheng, Ce
    Wu, Wenhan
    Chen, Chen
    Yang, Taojiannan
    Zhu, Sijie
    Shen, Ju
    Kehtarnavaz, Nasser
    Shah, Mubarak
    ACM COMPUTING SURVEYS, 2024, 56 (01)
  • [4] Deep Learning-based Pose Estimation for Mobile Manipulator Tasks
    Kim, Hae-Chang
    Yoon, In-Hwan
    Song, Jae-Bok
    Transactions of the Korean Society of Mechanical Engineers, A, 2022, 66 (03): 1161-1166
  • [5] Deep Learning-based Pose Estimation for Mobile Manipulator Tasks
    Kim, Hae-Chang
    Yoon, In-Hwan
    Song, Jae-Bok
    TRANSACTIONS OF THE KOREAN SOCIETY OF MECHANICAL ENGINEERS A, 2021, 45 (12): 1161-1166
  • [6] Automated Translation of Rebar Information from GPR Data into As-Built BIM: A Deep Learning-Based Approach
    Xiang, Zhongming
    Ou, Ge
    Rashidi, Abbas
    COMPUTING IN CIVIL ENGINEERING 2021, 2022: 374-381
  • [7] Deep learning-based pose estimation for African ungulates in zoos
    Hahn-Klimroth, Max
    Kapetanopoulos, Tobias
    Guebert, Jennifer
    Dierkes, Paul Wilhelm
    ECOLOGY AND EVOLUTION, 2021, 11 (11): 6015-6032
  • [8] 3D pose estimation dataset and deep learning-based ergonomic risk assessment in construction
    Fan, Chao
    Mei, Qipei
    Li, Xinming
    AUTOMATION IN CONSTRUCTION, 2024, 164
  • [9] A Novel Deep Transfer Learning-Based Approach for Face Pose Estimation
    Rusia, Mayank Kumar
    Singh, Dushyant Kumar
    Aquib Ansari, Mohd.
    CYBERNETICS AND INFORMATION TECHNOLOGIES, 2024, 24 (02): 105-121
  • [10] Deep learning-based human pose estimation towards artworks classification
    Kutrzynski, Marcin
    Krol, Dariusz
    JOURNAL OF INFORMATION AND TELECOMMUNICATION, 2024, 8 (04): 470-489