Machine Learning Fusion Model Approach for the Real-Time Detection of Head Gestures using IMUs

Cited by: 0
Authors
Davila-Montero, Sylmarie [1 ]
Mason, Andrew J. [2 ]
Affiliations
[1] Mil Coll SC, Citadel, Dept Elect & Comp Engn, Charleston, SC 29409 USA
[2] Michigan State Univ, Dept Elect & Comp Engn, E Lansing, MI 48824 USA
Source
SOUTHEASTCON 2024 | 2024
Keywords
fusion model; machine learning; real-time; inertial movement sensors; head gestures; healthy interactions; wearables; BEHAVIOR;
DOI
10.1109/SOUTHEASTCON52093.2024.10500094
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Modern sensor technology has contributed to the study of human behaviors during social interactions. Inertial movement units (IMUs) have shown great promise in the recognition of communication cues displayed by head gestures, which are important for healthy interactions. However, no gold standard exists to automatically detect head actions from IMUs. This paper presents the design of a real-time head-action detection (HAD) unit based on a new real-time fusion model architecture. An analysis of buffer sizes and feature contribution using a decision tree (DT) classifier and a predictor importance fusion is presented. The fusion model is composed of two classification stages, wherein the first stage focuses on recognizing head position and the second on recognizing head motion. The designed HAD unit uses a data buffer size of 3 s, 7 features in total, and a DT classifier. Results show a testing accuracy of 97.91% and an F1-score of 98.5%. The designed HAD unit and its architecture could allow for easy retraining to add recognition of additional head actions by having specialized head action classification models.
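The two-stage architecture summarized in the abstract can be illustrated with a minimal sketch, assuming scikit-learn decision trees; the feature windowing, labels, and function names below are illustrative assumptions and do not reproduce the paper's actual feature set or training procedure.

    # Minimal sketch of the two-stage fusion model (assumes scikit-learn;
    # labels, feature handling, and helper names are illustrative only).
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    WINDOW_SECONDS = 3   # 3 s data buffer, as reported in the abstract
    NUM_FEATURES = 7     # 7 features per window, as reported in the abstract

    position_clf = DecisionTreeClassifier(random_state=0)  # stage 1: head position
    motion_clfs = {}     # stage 2: one specialized motion model per position

    def train(X, y_position, y_motion):
        """Fit the position stage, then one motion model per position label."""
        position_clf.fit(X, y_position)
        for pos in np.unique(y_position):
            mask = y_position == pos
            clf = DecisionTreeClassifier(random_state=0)
            clf.fit(X[mask], y_motion[mask])
            motion_clfs[pos] = clf

    def detect(window_features):
        """Classify one feature window: first head position, then head motion."""
        x = np.asarray(window_features, dtype=float).reshape(1, -1)
        pos = position_clf.predict(x)[0]
        motion = motion_clfs[pos].predict(x)[0]
        return pos, motion

Keeping a separate motion model per head position is one way to realize the retraining benefit the abstract describes: adding recognition of a new head action would only require retraining the relevant specialized classifier rather than the whole model.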
Pages: 909-913
Number of pages: 5
References
14 in total
[1] Balti H, 2013, IEEE INT SYMP SIGNAL, P470, DOI 10.1109/ISSPIT.2013.6781926
[2] Borowska-Terka A, Strumillo P. Person Independent Recognition of Head Gestures from Parametrised and Raw Signals Recorded from Inertial Measurement Unit. APPLIED SCIENCES-BASEL, 2020, 10(12).
[3] Davila-Montero S, Parsnejad S, Ashoori E, Goderis D, Mason AJ. Design of a Multi-Sensor Framework for the Real-time Monitoring of Social Interactions. 2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022: 615-619.
[4] Davila-Montero S, Dana-Le JA, Bente G, Hall AT, Mason AJ. Review and Challenges of Technologies for Real-Time Human Behavior Monitoring. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, 2021, 15(1): 2-28.
[5] Fall CL, Quevillon F, Blouin M, Latour S, Campeau-Lecours A, Gosselin C, Gosselin B. A Multimodal Adaptive Wireless Control Interface for People With Upper-Body Disabilities. IEEE TRANSACTIONS ON BIOMEDICAL CIRCUITS AND SYSTEMS, 2018, 12(3): 564-575.
[6] Hernandez LM, 2006, GENES BEHAV SOCIAL E
[7] Ionut-Cristian S, Dan-Marius D. Using Inertial Sensors to Determine Head Motion-A Review. JOURNAL OF IMAGING, 2021, 7(12).
[8] Kossaifi J, Walecki R, Panagakis Y, Shen J, Schmitt M, Ringeval F, Han J, Pandit V, Toisoul A, Schuller B, Star K, Hajiyev E, Pantic M. SEWA DB: A Rich Database for Audio-Visual Emotion and Sentiment Research in the Wild. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2021, 43(3): 1022-1040.
[9] Sancheti K, 2018, TENCON IEEE REGION, P0356, DOI 10.1109/TENCON.2018.8650532
[10] Severin I., 2021, 2021 INT S SIGN CIRC, P3