An efficient multi-modal sensors feature fusion approach for handwritten characters recognition using Shapley values and deep autoencoder

Cited by: 1
Authors
Singh, Shashank Kumar [1 ]
Chaturvedi, Amrita [1 ]
Affiliations
[1] Indian Institute of Technology (BHU), Department of Computer Science and Engineering, Varanasi, India
Keywords
Hand gesture recognition; Electromyography; Feature fusion; Inertial Measurement Unit; Supervised learning; Feature extraction; Classification; Representation; Selection; Devices; Models; Online; Force; Image
DOI
10.1016/j.engappai.2024.109225
Chinese Library Classification (CLC) number
TP [automation technology, computer technology]
Discipline classification code
0812
Abstract
Handwriting is essential for the development of fine motor skills in children. Handwritten character recognition has the potential to facilitate natural human-machine interaction and to aid the digitization of handwritten text in educational environments such as smart classrooms. Electromyography (EMG), a widely recognized biosignal, captures the complex electrical patterns generated by muscle activity during handwriting movements, offering detailed insights into neuromuscular function. This study proposes an efficient multi-modal handwritten character recognition pipeline that integrates physiological (EMG) and Inertial Measurement Unit (IMU) sensors. EMG signals provide valuable information about muscle function and activation patterns, while IMU sensors track the motion and orientation associated with handwriting. The proposed system employs feature fusion, combining data from both sensor types. A cooperative game theory-based feature ranking method and a modified deep autoencoder architecture are used for enhanced data representation and feature extraction. A novel dataset comprising the 26 isolated handwritten English alphabet letters written on a whiteboard was collected for experimental validation. The proposed pipeline demonstrates high efficiency, achieving a classification accuracy of 99.01% on the isolated handwritten characters. Additional performance metrics, including the Matthews correlation coefficient (98.77) and Kappa score (98.97), were assessed to validate the model's effectiveness. The fusion of EMG and IMU data enhances system robustness, offering significant potential for digitizing handwritten notes in smart classrooms and for clinical handwriting analysis, including the diagnosis and monitoring of Alzheimer's disease.
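The pipeline summarized above combines feature-level fusion of EMG and IMU data with a cooperative game-theoretic (Shapley value) feature ranking. As a rough illustration only, the Python sketch below fuses synthetic EMG and IMU feature vectors by concatenation and ranks the fused features with a Monte Carlo Shapley approximation whose game value is held-out classification accuracy; the data, classifier, feature counts, and sampling budget are all assumptions for the sketch, and the paper's modified deep autoencoder stage is omitted.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, matthews_corrcoef, cohen_kappa_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: 500 handwriting samples, 32 EMG features, 24 IMU features,
# 26 classes (one per English letter). Real features would come from the sensors.
emg = rng.normal(size=(500, 32))
imu = rng.normal(size=(500, 24))
y = rng.integers(0, 26, size=500)

# Early (feature-level) fusion: concatenate each sample's EMG and IMU feature vectors.
X = np.hstack([emg, imu])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def subset_value(feature_subset):
    """Game value v(S): held-out accuracy of a classifier trained on feature subset S."""
    if len(feature_subset) == 0:
        return 0.0
    clf = RandomForestClassifier(n_estimators=25, random_state=0)
    clf.fit(X_tr[:, feature_subset], y_tr)
    return accuracy_score(y_te, clf.predict(X_te[:, feature_subset]))

def shapley_ranking(n_features, n_permutations=5):
    """Monte Carlo Shapley estimate: average marginal contribution of each feature
    over random feature orderings (a rough stand-in for an exact ranking)."""
    phi = np.zeros(n_features)
    for _ in range(n_permutations):
        order = rng.permutation(n_features)
        chosen, prev_value = [], 0.0
        for f in order:
            chosen.append(int(f))
            value = subset_value(chosen)
            phi[f] += value - prev_value  # marginal contribution of feature f
            prev_value = value
    return phi / n_permutations

phi = shapley_ranking(X.shape[1])
top_k = np.argsort(phi)[::-1][:20]  # keep the 20 highest-ranked fused features

final = RandomForestClassifier(n_estimators=200, random_state=0)
final.fit(X_tr[:, top_k], y_tr)
pred = final.predict(X_te[:, top_k])
print("accuracy:", accuracy_score(y_te, pred))
print("Matthews correlation coefficient:", matthews_corrcoef(y_te, pred))
print("Cohen's kappa:", cohen_kappa_score(y_te, pred))

In this sketch a feature's Shapley value is its average marginal contribution to held-out accuracy across random orderings; exact Shapley computation is exponential in the number of features, which is why sampled approximations (or SHAP-style estimators, as in reference [82] below) are typically used in practice.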
Pages: 23
References (148 in total)
[81]  Mane, S. arXiv:2103.07110, 2021.
[82]  Marcilio Jr., Wilson E.; Eler, Danilo M. From explanations to feature selection: assessing SHAP values as feature selection mechanism. 2020 33rd SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI 2020), 2020: 340-347.
[83]  Marti, U.-V.; Bunke, H. The IAM-database: An English sentence database for offline handwriting recognition. International Journal on Document Analysis and Recognition, 2002, 5(1): 39-46.
[84]  Misgar, Muzafar Mehraj; Mushtaq, Faisel; Khurana, Surinder Singh; Kumar, Munish. Recognition of offline handwritten Urdu characters using RNN and LSTM models. Multimedia Tools and Applications, 2023, 82(2): 2053-2076.
[85]  Mishra, Dipti; Singh, Satish Kumar; Singh, Rajat Kumar. Deep Architectures for Image Compression: A Critical Review. Signal Processing, 2022, 191.
[86]  Morency, L.-P. Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics, 2017: 3. DOI: 10.18653/v1/P17-5002.
[87]  Mushtaq, Faisel; Misgar, Muzafar Mehraj; Kumar, Munish; Khurana, Surinder Singh. UrduDeepNet: offline handwritten Urdu character recognition using deep neural network. Neural Computing & Applications, 2021, 33(22): 15229-15252.
[88]  Nabi, Syed Tufael; Kumar, Munish; Singh, Paramjeet. A convolution deep architecture for gender classification of Urdu handwritten characters. Multimedia Tools and Applications, 2024, 83(29): 72179-72194.
[89]  Narang, Sonika Rani; Kumar, Munish; Jindal, M. K. DeepNetDevanagari: a deep learning model for Devanagari ancient character recognition. Multimedia Tools and Applications, 2021, 80(13): 20671-20686.
[90]  Ngiam, J. Proceedings of the 28th International Conference on Machine Learning, 2011: 689.