Learning Human Activity From Visual Data Using Deep Learning

Cited by: 5
Authors
Alhersh, Taha [1 ]
Stuckenschmidt, Heiner [1 ]
Rehman, Atiq Ur [2 ]
Belhaouari, Samir Brahim [2 ]
Affiliations
[1] Univ Mannheim, Data & Web Sci Grp, D-68131 Mannheim, Germany
[2] Hamad Bin Khalifa Univ, Coll Sci & Engn, ICT Div, Doha, Qatar
Keywords
Sensors; Visualization; Activity recognition; Feature extraction; Cameras; Optical sensors; Optical network units; Human activity recognition; Deep learning; First-person vision
DOI
10.1109/ACCESS.2021.3099567
CLC Number (Chinese Library Classification)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Advances in wearable technologies have the potential to revolutionize and improve people's lives. The gains extend beyond the personal sphere to business and, by extension, the global economy. These technologies are incorporated into electronic devices that collect data from consumers' bodies and their immediate environment. Human activity recognition, which uses various body sensors and modalities either separately or simultaneously, is one of the most important areas of wearable technology development. In real-life scenarios, the number of sensors deployed is dictated by practical and financial considerations. Building on our earlier work, we reduce the number of required sensors, limiting ourselves to first-person vision data for activity recognition. Nonetheless, our results outperform the state of the art by more than 4% in F1 score.
Pages: 106245-106253
Page count: 9