Self-contained optical-inertial motion capturing for assembly planning in digital factory

Cited by: 13
Authors
Fang, Wei [1 ]
Zheng, Lianyu [1 ]
Xu, Jiaxing [1 ]
Affiliations
[1] Beihang University, School of Mechanical Engineering & Automation, Beijing 100191, China
Keywords
Motion capture; Optical inertial fusion; Assembly planning; Human-centered assembly; Digital factory; SIMULATION; VISION;
DOI
10.1007/s00170-017-0526-4
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
In assembly activities, assembly planning is a crucial issue for human-centered manufacturing. The challenge lies in retrieving and utilizing real-time data about human work activities on the shop floor. Unlike simulation-based assembly planning, marker-based motion capture systems can acquire realistic motion data of workers at assembly sites, but they are prone to failure under occlusion and are troublesome to install on the shop floor. Therefore, exploiting the complementary strengths of optical and inertial sensors, this paper presents a self-contained motion capture method for assembly planning on a real shop floor. It provides real-time, portable motion capture for workers, avoiding the failures of traditional outside-in motion capture systems caused by occlusions or incorrect installation. Moreover, the portable motion capture method can run on consumer mobile devices, providing a convenient and low-cost way to perceive workers' motion on the shop floor, which is significant for extensive applications in assembly verification and planning for digital factories. Finally, experiments are carried out to demonstrate the accuracy and feasibility of the proposed motion capture method for assembly activities.
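The complementary optical-inertial idea the abstract describes can be illustrated with a minimal sketch. The filter structure below is a generic complementary filter, not the paper's actual algorithm: the inertial (gyro) channel gives a smooth but drifting orientation estimate, while the optical channel is drift-free but intermittent (e.g. lost under occlusion). The function name, single-angle state, and blend weight `alpha` are all illustrative assumptions.

```python
def fuse_orientation(theta_prev, gyro_rate, dt, optical_theta=None, alpha=0.98):
    """One complementary-filter step for a single orientation angle (radians).

    theta_prev    -- previous fused estimate
    gyro_rate     -- angular rate from the inertial sensor (rad/s)
    dt            -- time step (s)
    optical_theta -- optical measurement, or None if the optical track
                     is lost (e.g. the marker/feature is occluded)
    alpha         -- weight on the inertial prediction (illustrative value)
    """
    # Inertial prediction: smooth, always available, but drifts over time.
    theta_pred = theta_prev + gyro_rate * dt
    if optical_theta is None:
        # Optical dropout: dead-reckon on the gyro alone.
        return theta_pred
    # Optical correction: pull the drifting estimate toward the
    # drift-free measurement.
    return alpha * theta_pred + (1.0 - alpha) * optical_theta
```

Because the inertial channel keeps the estimate alive during optical dropouts, the fused tracker degrades gracefully under occlusion instead of failing outright, which is the self-contained behavior the abstract argues for.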
Pages: 1243-1256
Page count: 14