AI-powered trustable and explainable fall detection system using transfer learning

Cited by: 2
Authors
Patel, Aryan Nikul [1 ]
Murugan, Ramalingam [2 ]
Maddikunta, Praveen Kumar Reddy [2 ]
Yenduri, Gokul [3 ]
Jhaveri, Rutvij H. [4 ]
Zhu, Yaodong [5 ]
Gadekallu, Thippa Reddy [6 ,7 ,8 ]
Affiliations
[1] Vellore Inst Technol, Sch Comp Sci & Engn, Vellore, India
[2] Vellore Inst Technol, Sch Comp Sci Engn & Informat Syst, Vellore, India
[3] VIT AP Univ, Sch Comp Sci & Engn, Amaravati 522237, Andhra Pradesh, India
[4] Pandit Deendayal Energy Univ, Sch Technol, Gandhinagar, Gujarat, India
[5] Jiaxing Univ, Sch Informat Sci & Engn, Jiaxing 314001, Peoples R China
[6] Zhejiang A&F Univ, Coll Math & Comp Sci, Hangzhou 311300, Peoples R China
[7] Lovely Profess Univ, Div Res & Dev, Phagwara, India
[8] Chitkara Univ, Inst Engn & Technol, Ctr Res Impact & Outcome, Rajpura 140401, Punjab, India
Keywords
Artificial intelligence; Explainable artificial intelligence; Transfer learning; Deep neural networks; Fall detection; WEARABLE SENSORS; RECOGNITION; MACHINE;
D O I
10.1016/j.imavis.2024.105164
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Accidental falls pose a significant public health challenge, especially among vulnerable populations. To address this issue, comprehensive research on fall detection and rescue systems is essential. Vision-based technologies, with their promising potential, offer an effective means to detect falls. This research paper presents a cutting-edge fall detection methodology aimed at enhancing individual safety and well-being. The proposed methodology utilizes deep neural networks, leveraging their capabilities to drive advancements in fall detection. To overcome data limitations and computational efficiency concerns, this study employs transfer learning, fine-tuning models pre-trained on large-scale image datasets for fall detection. This approach significantly enhances model performance, enabling better generalization and accuracy, especially in real-time applications with constrained resources. Notably, the methodology achieved an impressive test accuracy of 98.15%. Additionally, Explainable Artificial Intelligence (XAI) techniques are incorporated to ensure transparent and trustworthy decision-making in fall detection using deep learning models, especially in critical healthcare contexts for vulnerable individuals. XAI provides valuable insights into complex model architectures and parameters, enabling a deeper understanding of fall identification patterns. To evaluate the effectiveness of this approach, rigorous experimentation was conducted using a diverse dataset containing real-world fall and non-fall scenarios. The results demonstrate substantial improvements in both accuracy and interpretability, confirming the superiority of this method over conventional fall detection approaches.
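The abstract describes the general transfer-learning recipe (fine-tune a model pre-trained on a large-scale image dataset for binary fall / non-fall classification) without publishing code. As a hedged illustration only, the PyTorch sketch below freezes a generic pre-trained backbone and attaches a fresh two-class head; the backbone choice, head sizes, and dropout rate are assumptions for the example, not the architecture or hyperparameters reported in the paper.

```python
import torch
import torch.nn as nn


def build_fall_detector(backbone: nn.Module, feature_dim: int,
                        freeze_backbone: bool = True) -> nn.Module:
    """Wrap a pre-trained feature extractor with a new binary head.

    backbone     -- any convolutional feature extractor ending in a pooled
                    feature map (e.g. a torchvision model with its classifier
                    removed); assumed pre-trained on a large image dataset.
    feature_dim  -- number of features the backbone produces per image.
    """
    if freeze_backbone:
        # Transfer-learning step: keep the pre-trained weights fixed so only
        # the small task-specific head is trained on the fall dataset.
        for p in backbone.parameters():
            p.requires_grad = False

    head = nn.Sequential(
        nn.Linear(feature_dim, 128),
        nn.ReLU(),
        nn.Dropout(0.3),          # illustrative regularization choice
        nn.Linear(128, 2),        # logits for {fall, non-fall}
    )
    return nn.Sequential(backbone, nn.Flatten(), head)
```

In practice the `backbone` would be something like `torchvision.models.mobilenet_v2(weights="IMAGENET1K_V1").features` followed by global average pooling (again, an assumed choice); a Grad-CAM-style saliency map over the frozen backbone's last convolutional layer is one common way to supply the XAI explanations the abstract refers to.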
Pages: 19
Related Papers
50 items in total
  • [31] Personalized cancer vaccine design using AI-powered technologies
    Kumar, Anant
    Dixit, Shriniket
    Srinivasan, Kathiravan
    Dinakaran, M.
    Vincent, P. M. Durai Raj
    FRONTIERS IN IMMUNOLOGY, 2024, 15
  • [32] Ensemble transfer learning meets explainable AI: A deep learning approach for leaf disease detection
    Raval, Hetarth
    Chaki, Jyotismita
    ECOLOGICAL INFORMATICS, 2024, 84
  • [33] Prototype of AI-powered assistance system for digitalisation of manual waste sorting
    Aberger, J.
    Shami, S.
    Haecker, B.
    Pestana, J.
    Khodier, K.
    Sarc, R.
    WASTE MANAGEMENT, 2025, 194 : 366 - 378
  • [34] Optimized Tiny Machine Learning and Explainable AI for Trustable and Energy-Efficient Fog-Enabled Healthcare Decision Support System
    Arthi, R.
    Krishnaveni, S.
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2024, 17 (01)
  • [35] An Explainable AI Paradigm for Alzheimer's Diagnosis Using Deep Transfer Learning
    Mahmud, Tanjim
    Barua, Koushick
    Habiba, Sultana Umme
    Sharmen, Nahed
    Hossain, Mohammad Shahadat
    Andersson, Karl
    DIAGNOSTICS, 2024, 14 (03)
  • [36] Understanding Polymers Through Transfer Learning and Explainable AI
    Miccio, Luis A.
    APPLIED SCIENCES-BASEL, 2024, 14 (22)
  • [37] AI-Powered Penetration Testing using Shennina: From Simulation to Validation
    Karagiannis, Stylianos
    Fusco, Camilla
    Agathos, Leonidas
    Mallouli, Wissam
    Casola, Valentina
    Ntantogian, Christoforos
    Magkos, Emmanouil
    19TH INTERNATIONAL CONFERENCE ON AVAILABILITY, RELIABILITY, AND SECURITY, ARES 2024, 2024,
  • [38] AI-powered Fraud Detection in Decentralized Finance: A Project Life Cycle Perspective
    Luo, Bingqiao
    Zhang, Zhen
    Wang, Qian
    Ke, Anli
    Lu, Shengliang
    He, Bingsheng
    ACM COMPUTING SURVEYS, 2025, 57 (04)
  • [39] Complex business ecosystem intelligence using AI-powered visual analytics
    Basole, Rahul C.
    Park, Hyunwoo
    Seuss, David
    DECISION SUPPORT SYSTEMS, 2024, 178
  • [40] Explainable AI-Powered IoT Systems for Predictive and Preventive Healthcare - A Framework for Personalized Health Management and Wellness Optimization
    Kumbhar, Uddhav T.
    Phursule, Rajesh
    Patil, V. C.
    Moje, Ravindra K.
    Shete, Omkar R.
    Tayal, Madhuri A.
    JOURNAL OF ELECTRICAL SYSTEMS, 2023, 19 (03) : 23 - 31