Recognizing Grabbing Actions from Inertial and Video Sensor Data in a Warehouse Scenario

Cited by: 7
Authors
Diete, Alexander [1 ]
Sztyler, Timo [1 ]
Weiland, Lydia [1 ]
Stuckenschmidt, Heiner [1 ]
Affiliations
[1] Univ Mannheim, B6 26, D-68159 Mannheim, Germany
Source
14TH INTERNATIONAL CONFERENCE ON MOBILE SYSTEMS AND PERVASIVE COMPUTING (MOBISPC 2017) / 12TH INTERNATIONAL CONFERENCE ON FUTURE NETWORKS AND COMMUNICATIONS (FNC 2017) / AFFILIATED WORKSHOPS | 2017, Vol. 110
Keywords
machine learning; sensor fusion; action recognition; ORDER PICKING; DESIGN;
DOI
10.1016/j.procs.2017.06.071
Chinese Library Classification
TN [Electronic Technology, Communication Technology]
Subject Classification Code
0809
Abstract
Modern industries are increasingly adopting smart devices to aid and improve their productivity and workflow. This includes logistics in warehouses, where validation of the correct items per order can be enhanced with mobile devices. Since handling incorrect orders accounts for a large share of warehouse operating costs, errors such as missed or wrong items should be avoided. Thus, early identification of picking procedures and of the items picked helps reduce these errors. By using data glasses and a smartwatch, we aim to reduce these errors while also enabling the picker to work hands-free. In this paper, we present an analysis of feature sets for classifying grabbing actions in the order picking process. For this purpose, we created a dataset containing inertial data and egocentric video from four participants performing picking tasks, modeled closely on a real-world warehouse environment. We extract time- and frequency-domain features from the inertial data, and color and descriptor features from the image data, to learn grabbing actions. Using three different supervised learning approaches on inertial and video data, we are able to recognize grabbing actions in a picking scenario. We show that the combination of video and inertial sensors yields an F-measure of 85.3% for recognizing grabbing actions. (c) 2017 The Authors. Published by Elsevier B.V.
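The abstract's time- and frequency-domain feature extraction for the inertial stream can be sketched roughly as follows. The window length, sampling rate, and the specific features (mean, standard deviation, range, dominant frequency and its magnitude) are illustrative assumptions for a generic accelerometer pipeline, not the paper's exact feature set.

```python
import numpy as np

def extract_window_features(window, fs=50.0):
    """Time- and frequency-domain features for one inertial sensor window.

    `window` is a 1-D array of accelerometer samples along one axis;
    `fs` is the sampling rate in Hz (both hypothetical choices here).
    """
    # Time domain: mean, standard deviation, and signal range
    feats = [window.mean(), window.std(), window.max() - window.min()]
    # Frequency domain: dominant non-DC frequency and its magnitude,
    # from the real FFT of the mean-removed window
    spectrum = np.abs(np.fft.rfft(window - window.mean()))
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    peak = spectrum[1:].argmax() + 1  # skip the DC bin
    feats += [freqs[peak], spectrum[peak]]
    return np.array(feats)

# Example: a 2 Hz sine sampled at 50 Hz for a one-second window
t = np.arange(50) / 50.0
features = extract_window_features(np.sin(2 * np.pi * 2 * t))
```

Per-window vectors like this one, concatenated across axes and sensors, would then feed the supervised classifiers; the fusion with the video-based color and descriptor features is a separate step not shown here.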
Pages: 16-23
Page count: 8