MAMAS: Supporting Parent-Child Mealtime Interactions Using Automated Tracking and Speech Recognition

Cited by: 6
Authors
Jo E. [1 ]
Bang H. [2 ]
Ryu M. [3 ]
Sung E.J. [1 ]
Leem S. [1 ]
Hong H. [1 ]
Affiliations
[1] Seoul National University, Seoul
[2] Yonsei University, Seoul
[3] Purdue University, West Lafayette
Funding
National Research Foundation of Singapore
Keywords
children; eating behavior; family mealtime; magnetometer; parent-child interaction; semi-automated tracking; speech recognition
DOI
10.1145/3392876
Abstract
Many parents of young children find it challenging to deal with their children's eating problems, and parent-child mealtime interaction is fundamental to forming children's healthy eating habits. In this paper, we present the results of a three-week study in which we deployed a mealtime assistant application, MAMAS, with 15 parent-child pairs to monitor parent-child mealtime conversation and food intake. Our findings indicate that the use of MAMAS helped 1) increase children's autonomy during mealtime, 2) enhance parents' self-awareness of their words and behaviors, 3) strengthen the parent-child relationship, and 4) positively influence the mealtime experiences of the entire family. The study also revealed challenges in eating behavior interventions that stem from the complex dynamics of childhood eating problems. Based on these findings, we discuss how a mealtime assistant application can be better designed for parents and children with challenging eating behaviors. © 2020 ACM.