Vision-Based Construction Worker Activity Analysis Informed by Body Posture

Cited by: 73
Authors
Roberts, Dominic [1 ]
Torres Calderon, Wilfredo [2 ]
Tang, Shuai [2 ]
Golparvar-Fard, Mani [3 ]
Affiliations
[1] University of Illinois at Urbana-Champaign, Department of Computer Science, Urbana, IL 61801, USA
[2] University of Illinois at Urbana-Champaign, Department of Civil and Environmental Engineering, Urbana, IL 61801, USA
[3] University of Illinois at Urbana-Champaign, Departments of Civil and Environmental Engineering and Computer Science, Urbana, IL 61801, USA
Funding
US National Science Foundation;
Keywords
ACTION RECOGNITION; EARTHMOVING EXCAVATORS; POSE ESTIMATION; PRODUCTIVITY; TRACKING; RESOURCES; EQUIPMENT; FEATURES; CONTEXT; SYSTEM;
DOI
10.1061/(ASCE)CP.1943-5487.0000898
Chinese Library Classification (CLC)
TP39 [Applications of Computers];
Discipline classification codes
081203; 0835;
Abstract
Activity analysis of construction resources is generally performed by manually observing construction operations, either in person or through recorded videos. It is thus prone to observer fatigue and bias and is of limited scalability and cost-effectiveness. Automating this procedure obviates these issues and allows project teams to focus on performance improvement. This paper introduces a novel deep learning- and vision-based activity analysis framework that estimates and tracks two-dimensional (2D) worker pose and outputs per-frame worker activity labels given input red-green-blue (RGB) video footage of a construction worker operation. We used 317 annotated videos of bricklaying and plastering operations to train and validate the proposed method. The method achieved 82.6% mean average precision (mAP) for pose estimation, and 72.6% multiple-object tracking accuracy (MOTA) and 81.3% multiple-object tracking precision (MOTP) for pose tracking. A cross-validation activity analysis accuracy of 78.5% was also obtained. We show that worker pose contributes to activity analysis results. This highlights the potential for using vision-based ergonomics assessment methods that rely on pose in conjunction with the proposed method for assessing the ergonomic viability of individual activities. © 2020 American Society of Civil Engineers.
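For context, MOTA and MOTP as reported in the abstract are conventionally the CLEAR MOT metrics of Bernardin and Stiefelhagen. A plausible reading of the two scores, assuming those standard definitions (the paper may compute a keypoint-based variant for pose tracking), is:

\[
\mathrm{MOTA} = 1 - \frac{\sum_t \left(\mathrm{FN}_t + \mathrm{FP}_t + \mathrm{IDSW}_t\right)}{\sum_t \mathrm{GT}_t},
\qquad
\mathrm{MOTP} = \frac{\sum_{t,i} d_{t,i}}{\sum_t c_t},
\]

where, for frame \(t\), \(\mathrm{FN}_t\), \(\mathrm{FP}_t\), and \(\mathrm{IDSW}_t\) are the numbers of missed targets, false positives, and identity switches, \(\mathrm{GT}_t\) is the number of ground-truth targets, \(d_{t,i}\) is the localization error (or overlap score) of matched hypothesis \(i\), and \(c_t\) is the number of matches. Under this reading, MOTA summarizes tracking errors relative to the ground truth while MOTP summarizes localization quality of the matched detections only.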
Pages: 17