Transfer learning and its extensive appositeness in human activity recognition: A survey

Cited by: 16
Authors
Ray, Abhisek [1 ]
Kolekar, Maheshkumar H. [2 ]
Affiliations
[1] Indian Inst Technol Patna, Dept Elect Engn, Video Surveillance Lab, Bihta 801103, Bihar, India
[2] Indian Inst Technol Patna, Dept Elect Engn, Bihta 801103, Bihar, India
Keywords
Deep learning; Human activity recognition; Machine learning; Transfer learning; Cross-domain transfer; UNSUPERVISED DOMAIN ADAPTATION; UNLABELED DATA; CLASSIFICATION; SYSTEMS; ALGORITHMS; QUALITY; MOBILE;
DOI
10.1016/j.eswa.2023.122538
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this competitive world, the supervision and monitoring of human resources are primary and necessary tasks to drive context-aware applications. Advancements in sensor and computational technology have cleared the path for automatic human activity recognition (HAR). Machine learning, and later deep learning, have played a cardinal role in this automation process. Classical machine learning approaches follow the hypothesis that the training, validation, and testing data belong to the same domain, where the data distribution characteristics and the input feature space are alike. However, during real-time HAR, this hypothesis does not always hold true. Transfer learning addresses this gap by transferring the required knowledge among heterogeneous data of various activities. To present the hierarchical advancements in transfer learning-enhanced HAR, we have shortlisted the 150 most influential works and articles from 2014 to 2021 based on their contribution, citation score, and year of publication. These selected articles were collected from the IEEE Xplore, Web of Science, and Google Scholar digital libraries. We have also analyzed the statistical research interest related to this topic to substantiate the significance of our survey, finding that research publications in this domain have grown by roughly 10% every year. Our survey provides a unique classification model to delineate the diversity in transfer learning-based HAR. It delves into the world of HAR datasets, exploring their types, specifications, advantages, and limitations. We also examine the steps involved in HAR, including the various transfer learning techniques and performance metrics, as well as the computational complexity associated with these methods. Additionally, we identify the challenges and gaps in HAR related to transfer learning and provide insights into future directions for researchers in this field. Based on the survey findings, researchers prefer the inductive transfer method, feature-learning transfer mode, and cross-action transfer domain over others due to their superior performance, with respective popularity scores of 55%, 40.8%, and 50.2%. This review aims to equip readers with a comprehensive understanding of HAR and transfer learning mechanisms, while also highlighting areas that require further research.
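To make the abstract's notion of inductive, feature-learning transfer concrete, the following Python/PyTorch sketch (not taken from the surveyed article; the dimensions, class counts, and data-loader names are illustrative assumptions) pretrains a feature extractor on source-domain activities and then reuses the frozen representation while fitting only a new classifier head on a smaller set of target activities:

    # Minimal sketch of inductive, feature-learning transfer for sensor-based HAR.
    # A backbone is pretrained on source activities, then frozen and reused while
    # only a new classifier head is trained on the target activities.
    import torch
    import torch.nn as nn

    FEAT_DIM, SRC_CLASSES, TGT_CLASSES = 64, 6, 4  # hypothetical sizes

    class FeatureExtractor(nn.Module):
        """Maps a window of sensor readings (e.g., 128 accelerometer samples x 3 axes)
        to a shared feature representation."""
        def __init__(self, in_dim=128 * 3, feat_dim=FEAT_DIM):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, feat_dim), nn.ReLU())
        def forward(self, x):
            return self.net(x)

    def train(backbone, head, loader, epochs=5, lr=1e-3, freeze_backbone=False):
        """Train backbone + head; when freeze_backbone is True only the head is updated."""
        params = list(head.parameters()) if freeze_backbone else \
                 list(backbone.parameters()) + list(head.parameters())
        opt = torch.optim.Adam(params, lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        backbone.train(not freeze_backbone); head.train()
        for _ in range(epochs):
            for x, y in loader:
                feats = backbone(x).detach() if freeze_backbone else backbone(x)
                loss = loss_fn(head(feats), y)
                opt.zero_grad(); loss.backward(); opt.step()

    # 1) Pretrain on the large, labeled source-domain activities.
    backbone = FeatureExtractor()
    src_head = nn.Linear(FEAT_DIM, SRC_CLASSES)
    # train(backbone, src_head, source_loader)       # source_loader: placeholder DataLoader

    # 2) Transfer: freeze the backbone, fit a new head on the small target-domain set.
    tgt_head = nn.Linear(FEAT_DIM, TGT_CLASSES)
    for p in backbone.parameters():
        p.requires_grad = False
    # train(backbone, tgt_head, target_loader, freeze_backbone=True)  # target_loader: placeholder

Unfreezing the final backbone layers during step 2 would turn this feature-reuse sketch into full fine-tuning, another inductive-transfer variant discussed in the survey.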
Pages: 32