Automated Estimation of Food Type and Amount Consumed from Body-worn Audio and Motion Sensors

Cited by: 66
Authors
Mirtchouk, Mark [1 ]
Merck, Christopher [1 ]
Kleinberg, Samantha [1 ]
Affiliations
[1] Stevens Inst Technol, Hoboken, NJ 07030 USA
Source
UBICOMP'16: PROCEEDINGS OF THE 2016 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING | 2016
Funding
U.S. National Science Foundation
Keywords
Nutrition; Eating recognition; Acoustic and motion sensing;
DOI
10.1145/2971648.2971677
CLC number
TP301 [Theory, methods]
Subject classification code
081202
Abstract
Determining when an individual is eating can be useful for tracking behavior and identifying patterns, but to create nutrition logs automatically or provide real-time feedback to people with chronic disease, we need to identify both what they are consuming and in what quantity. However, food type and amount have mainly been estimated using image data (requiring user involvement) or acoustic sensors (tested with a restricted set of foods rather than representative meals). As a result, there is not yet a highly accurate automated nutrition monitoring method that can be used with a variety of foods. We propose that multi-modal sensing (in-ear audio plus head and wrist motion) can be used to more accurately classify food type, as audio and motion features provide complementary information. Further, we propose that knowing food type is critical for estimating amount consumed in combination with sensor data. To test this we use data from people wearing audio and motion sensors, with ground truth annotated from video and continuous scale data. With data from 40 unique foods we achieve a classification accuracy of 82.7% with a combination of sensors (versus 67.8% for audio alone and 76.2% for head and wrist motion). Weight estimation error was reduced from a baseline of 127.3% to 35.4% absolute relative error. Ultimately, our estimates of food type and amount can be linked to food databases to provide automated calorie estimates from continuously-collected data.
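The abstract's central idea is that audio and motion features are complementary, so concatenating them (early fusion) before classification outperforms either modality alone. As a rough illustration only, the sketch below fuses toy audio and motion feature vectors and classifies with a nearest-centroid rule; the feature values, food labels, and classifier here are illustrative assumptions, not the paper's actual pipeline.

```python
# Hedged sketch of early fusion for food-type classification.
# The feature vectors, labels, and nearest-centroid classifier are
# illustrative stand-ins, not the features or model used in the paper.
import math

def fuse(audio_feats, motion_feats):
    """Concatenate per-window audio and motion features (early fusion)."""
    return audio_feats + motion_feats

def nearest_centroid(train, sample):
    """Classify by Euclidean distance to per-class mean feature vectors."""
    centroids = {}
    for label, vecs in train.items():
        dim = len(vecs[0])
        centroids[label] = [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda lab: dist(centroids[lab], sample))

# Toy training data: two foods, each with fabricated 2-D audio + 2-D motion features.
train = {
    "apple":  [fuse([0.9, 0.1], [0.2, 0.8]), fuse([0.8, 0.2], [0.3, 0.7])],
    "yogurt": [fuse([0.1, 0.9], [0.7, 0.1]), fuse([0.2, 0.8], [0.8, 0.2])],
}
label = nearest_centroid(train, fuse([0.85, 0.15], [0.25, 0.75]))  # → "apple"
```

The fused vector gives the classifier both chewing-acoustics and movement cues at once, which is the intuition behind the reported jump from 67.8% (audio only) to 82.7% (combined sensors).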
Pages: 451-462 (12 pages)