Metrological Evaluation of Human-Robot Collaborative Environments Based on Optical Motion Capture Systems

Cited by: 9
Authors
Gonzalez, Leticia [1 ]
Alvarez, Juan C. [1 ]
Lopez, Antonio M. [1 ]
Alvarez, Diego [1 ]
Affiliations
[1] Univ Oviedo, Dept Elect, Multisensor Syst & Robot Grp SiMuR, Elect Comp & Syst Engn, C Pedro Puig Adam, Gijon 33203, Spain
Keywords
calibration; groupware; human-robot interaction; industrial robots; optical tracking; accuracy
DOI
10.3390/s21113748
Chinese Library Classification
O65 [Analytical Chemistry];
Subject Classification Codes
070302 ; 081704 ;
Abstract
In the context of shared human-robot collaborative environments, optical motion capture (OMC) systems are increasingly used for human motion tracking. The accuracy and precision of OMC technology must be assessed to ensure safe human-robot interaction, since the accuracy specifications provided by manufacturers can be degraded in practice by the many factors that affect the measurements. This article describes a new methodology for the metrological evaluation of a human-robot collaborative environment based on OMC systems. Inspired by the ASTM E3064 test guide, and taking advantage of an industrial robot already present in the production cell, the system is evaluated for mean error, error spread, and repeatability. A detailed statistical study of the error distribution across the capture area is carried out, supported by a Mann-Whitney U-test for median comparisons. Based on the results, optimal capture areas for the use of the capture system are suggested. The metrological characteristics obtained with the proposed method are compatible with, and comparable in quality to, those of other methods that do not require the intervention of an industrial robot.
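As a concrete illustration of the kind of analysis the abstract describes, the snippet below computes mean error and error spread for two zones of a capture area and compares their medians with a Mann-Whitney U-test. The error values are invented for illustration only; the paper's actual measurements are not reproduced here.

```python
# Illustrative sketch (invented data): evaluating OMC tracking error against
# robot-reported reference positions, in the spirit of the metrics the
# abstract names (mean error, error spread, median comparison between zones).
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical absolute position errors (mm) sampled in two zones of the
# capture area; real values would come from OMC-vs-robot comparisons.
errors_center = np.array([0.21, 0.25, 0.19, 0.23, 0.20, 0.22, 0.24, 0.18])
errors_edge = np.array([0.35, 0.41, 0.38, 0.33, 0.40, 0.37, 0.36, 0.42])

# Mean error and error spread (sample standard deviation) per zone
for name, e in [("center", errors_center), ("edge", errors_edge)]:
    print(f"{name}: mean = {e.mean():.3f} mm, spread = {e.std(ddof=1):.3f} mm")

# Mann-Whitney U-test: do the two zones have different error medians?
stat, p_value = mannwhitneyu(errors_center, errors_edge, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.4f}")  # p < 0.05 suggests the medians differ
```

A nonparametric test is a natural choice here because tracking-error distributions are typically skewed, so a median comparison is more robust than a t-test on means.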
Pages: 16
Related Papers
30 records
[1]   Validation of Thigh Angle Estimation Using Inertial Measurement Unit Data against Optical Motion Capture Systems [J].
Abhayasinghe, Nimsiri ;
Murray, Iain ;
Bidabadi, Shiva Sharif .
SENSORS, 2019, 19 (03)
[2]   Accuracy map of an optical motion capture system with 42 or 21 cameras in a large measurement volume [J].
Aurand, Alexander M. ;
Dufour, Jonathan S. ;
Marras, William S. .
JOURNAL OF BIOMECHANICS, 2017, 58 :237-240
[3]   Differences in gaze anticipation for locomotion with and without vision [J].
Authie, Colas N. ;
Hilt, Pauline M. ;
N'Guyen, Steve ;
Berthoz, Alain ;
Bennequin, Daniel .
FRONTIERS IN HUMAN NEUROSCIENCE, 2015, 9
[4]   Safety assurance mechanisms of collaborative robotic systems in manufacturing [J].
Bi, Z. M. ;
Luo, Chaomin ;
Miao, Zhonghua ;
Zhang, Bing ;
Zhang, W. J. ;
Wang, Lihui .
ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, 2021, 67
[5]  
Bostelman R. V., 2016, AUTONOMOUS IND VEHIC, P1
[6]  
Bostelman Roger, 2017, J CMSC, V12, P314
[7]   Technical quality assessment of an optoelectronic system for movement analysis [J].
Di Marco, R. ;
Rossi, S. ;
Patane, F. ;
Cappa, P. .
2014 JOINT IMEKO TC1-TC7-TC13 SYMPOSIUM: MEASUREMENT SCIENCE BEHIND SAFETY AND SECURITY, 2015, 588
[8]   Effects of the calibration procedure on the metrological performances of stereophotogrammetric systems for human movement analysis [J].
Di Marco, Roberto ;
Rossi, Stefano ;
Castelli, Enrico ;
Patane, Fabrizio ;
Mazza, Claudia ;
Cappa, Paolo .
MEASUREMENT, 2017, 101 :265-271
[9]   Analysis of accuracy in optical motion capture - A protocol for laboratory setup evaluation [J].
Eichelberger, Patric ;
Ferraro, Matteo ;
Minder, Ursina ;
Denton, Trevor ;
Blasimann, Angela ;
Krause, Fabian ;
Baur, Heiner .
JOURNAL OF BIOMECHANICS, 2016, 49 (10) :2085-2088
[10]   Evaluation of Inertial Sensor Data by a Comparison with Optical Motion Capture Data of Guitar Strumming Gestures [J].
Freire, Sergio ;
Santos, Geise ;
Armondes, Augusto ;
Meneses, Eduardo A. L. ;
Wanderley, Marcelo M. .
SENSORS, 2020, 20 (19) :1-27