A Machine Learning Model for Toothbrush Position Tracking using a Low-cost 6-axis IMU Sensor

Cited by: 0
Authors
Kwon M. [1 ]
Lim D. [2 ]
Kim D. [3 ]
Ryu S. [4 ]
Jo E. [4 ]
Kim Y.W. [4 ]
Kim J.H. [1 ]
Affiliations
[1] Dept. of AI Convergence Engineering(BK21), Gyeongsang National University
[2] Information & Communication Engineering, Gyeongsang National University
[3] Dept. of Intelligence and Communication Engineering, Gyeongsang National University
[4] PAIST (ProxiHealthcare Advanced Institute for Science and Technology), Seoul
Funding
National Research Foundation, Singapore
Keywords
6-axis IMU sensor; Classification; Data analysis; Machine learning; Quaternion
DOI
10.5370/KIEE.2024.73.2.358
Abstract
The recent epidemic of respiratory diseases has underscored the importance of personal oral health care. Oral diseases, primarily caused by viral infections, can be reduced by regularly eliminating oral microorganisms. Effective tooth brushing is fundamental to oral health, but changing established brushing habits is difficult, and adherence to recommended brushing techniques is a challenge across all age groups, from children to adults and older people. This study applies machine-learning-based classification to data from a low-cost 6-axis IMU sensor to identify 13 brushing positions. We evaluate eight machine learning models on the sensor's acceleration and angular-velocity data and assess their performance with several metrics. Our results show that these models classify brushing position with approximately 89% accuracy. This approach enables monitoring of brushed areas and analysis of brushing patterns to improve brushing quality and adherence to recommended techniques. By improving brushing quality, it is thus possible to maintain basic personal oral care and help prevent various diseases. © 2024 Korean Institute of Electrical Engineers. All rights reserved.
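The abstract describes classifying 13 brushing positions from windowed 6-axis IMU data (3-axis acceleration plus 3-axis angular velocity). A minimal sketch of that kind of pipeline is shown below on synthetic data; the windowing, the per-window mean/std features, and the random-forest model are assumptions for illustration, not the paper's actual method.

```python
# Hypothetical sketch: 13-class brushing-position classification from
# 6-axis IMU features. Feature choice (per-window mean and std of each
# axis) and the model are assumptions, not the paper's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_windows, n_classes = 1300, 13

# Placeholder features: 12 values per window (mean and std of 3 accel
# axes + 3 gyro axes). Real features would come from the toothbrush IMU.
X = rng.normal(size=(n_windows, 12))
y = rng.integers(0, n_classes, size=n_windows)
X += y[:, None] * 0.5  # shift class means so the toy data is separable

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

In practice, accuracy would be reported on windows from held-out brushing sessions (or held-out users) rather than a random split, since consecutive windows from one session are highly correlated.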
Pages: 358-367
Number of pages: 9