Gesture recognition and information recommendation based on machine learning and virtual reality in distance education

Cited by: 10
Authors
Juan, Wan [1 ]
Affiliations
[1] Xian Peihua Univ, Sch Intelligence Sci & Informat Engn, Xian, Shaanxi, Peoples R China
Keywords
Machine learning; virtual reality; distance education; gesture features; feature recognition; BIG DATA; ANALYTICS;
DOI
10.3233/JIFS-189572
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Dynamic and static gesture recognition for distance-education scenarios is not yet theoretically mature, leaves considerable room for development, and has seen relatively little application in education. The purpose of this article is to combine gesture recognition with classroom teaching and to introduce a dynamic gesture recognition method. The study describes data collection and preprocessing in detail, converts the data of the gesture action area into gray-value images, and then applies an improved algorithm for classification. In addition, a control experiment is designed to analyze the algorithm's performance, comparing recognition accuracy under both simple and complex backgrounds. The results show that teaching-gesture recognition in distance education can effectively improve teaching efficiency, achieves high accuracy, and can be applied directly in the system.
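The abstract mentions converting the gesture action area into gray-value images as a preprocessing step, but this record does not specify how. A minimal sketch of such a step, assuming a rectangular region of an RGB frame and standard ITU-R BT.601 luminance weights (the bounding-box format and function name here are illustrative, not taken from the paper), might look like:

```python
import numpy as np

def gesture_roi_to_gray(frame: np.ndarray, box: tuple) -> np.ndarray:
    """Crop the gesture action area from an RGB frame and map it to gray values.

    box is a hypothetical (x, y, width, height) bounding box of the gesture region.
    """
    x, y, w, h = box
    roi = frame[y:y + h, x:x + w].astype(np.float64)
    # ITU-R BT.601 luminance weights, a common RGB-to-gray conversion
    gray = roi @ np.array([0.299, 0.587, 0.114])
    return gray.round().astype(np.uint8)

# Toy example: a 4x4 "frame" with a 2x2 gesture region
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[1:3, 1:3] = [200, 100, 50]
gray = gesture_roi_to_gray(frame, (1, 1, 2, 2))
```

The resulting single-channel image can then be fed to a classifier, which is the role the abstract assigns to the improved algorithm.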
Pages: 7509-7519
Number of pages: 11
References
29 records
[1]   Three dimensional motion capture applied to violin playing: A study on feasibility and characterization of the motor strategy [J].
Ancillao, Andrea ;
Savastano, Bernardo ;
Galli, Manuela ;
Albertini, Giorgio .
COMPUTER METHODS AND PROGRAMS IN BIOMEDICINE, 2017, 149 :19-27
[2]   IMPULSIVE CONTROLABILITY OF TUMOR GROWTH [J].
Antonov, Andrey ;
Nenov, Svetoslav ;
Tsvetkov, Tsvetelin .
DYNAMIC SYSTEMS AND APPLICATIONS, 2019, 28 (01) :93-109
[3]  
Chen ZW., 2015, INT J MULTIMEDIA UBI, V10, P17, DOI 10.14257/ijmue.2015.10.9.03
[4]  
Cheng Yu., 2020, DYNAM SYST APPL, V29, P683
[5]  
Chu, 2016, OPTICS PHOTONICS J, V06, P155, DOI 10.4236/opj.2016.68B026
[6]   Can the Functional Movement Screen™ be used to capture changes in spine and knee motion control following 12 weeks of training? [J].
Frost, David M. ;
Beach, Tyson A. C. ;
Campbell, Troy L. ;
Callaghan, Jack P. ;
McGill, Stuart M. .
PHYSICAL THERAPY IN SPORT, 2017, 23 :50-57
[7]  
Fry AC., 2016, Big Data Analytics, V1, P11, DOI 10.1186/s41044-016-0008-y
[8]   In vivo Bone Position Measurement Using High-Frequency Ultrasound Validated with 3-D Optical Motion Capture Systems: A Feasibility Study [J].
Giannetti, Romano ;
Petrella, Anthony ;
Bach, Joel ;
Silverman, Anne K. ;
Saenz-Nuno, M. A. ;
Perez-Mallada, N. .
JOURNAL OF MEDICAL AND BIOLOGICAL ENGINEERING, 2017, 37 (04) :519-526
[9]  
Hu, 2014, TEXTILES LIGHT IND S, V3, P57, DOI 10.14355/tlist.2014.03.009
[10]   Motion Capture Depends Upon the Common Fate Factor Among Elements [J].
Ichikawa, Makoto ;
Masakura, Yuko .
PERCEPTION, 2017, 46 (12) :1371-1385