MUSEFood: Multi-sensor-based Food Volume Estimation on Smartphones

Cited by: 21
Authors
Gao, Junyi [1 ,3 ,4 ]
Tan, Weihao [1 ,4 ]
Ma, Liantao [1 ,4 ]
Wang, Yasha [1 ,2 ,3 ]
Tang, Wen [5 ]
Affiliations
[1] Minist Educ, Key Lab High Confidence Software Technol, Beijing 100871, Peoples R China
[2] Peking Univ, Natl Engn Res Ctr Software Engn, Beijing 100871, Peoples R China
[3] Peking Univ, Informat Technol Inst Tianjin Binhai, Tianjin 300450, Peoples R China
[4] Peking Univ, Sch Elect Engn & Comp Sci, Beijing 100871, Peoples R China
[5] Peking Univ Third Hosp, Dept Nephrol, Beijing 100191, Peoples R China
Source
2019 IEEE SMARTWORLD, UBIQUITOUS INTELLIGENCE & COMPUTING, ADVANCED & TRUSTED COMPUTING, SCALABLE COMPUTING & COMMUNICATIONS, CLOUD & BIG DATA COMPUTING, INTERNET OF PEOPLE AND SMART CITY INNOVATION (SMARTWORLD/SCALCOM/UIC/ATC/CBDCOM/IOP/SCI 2019) | 2019
Funding
National Natural Science Foundation of China;
Keywords
food volume estimation; diet management; image segmentation; smartphone sensing;
DOI
10.1109/SmartWorld-UIC-ATC-SCALCOM-IOP-SCI.2019.00182
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Research has shown that diet recording can help people increase awareness of food intake and improve nutrition management, thereby maintaining a healthier life. Recently, researchers have been working on smartphone-based diet recording methods and applications that help users accomplish two tasks: recording what they eat and how much they eat. Although great progress has been made on the former task through image recognition technology, it remains a challenge to estimate the volume of foods accurately and conveniently. In this paper, we propose a novel method, named MUSEFood, for food volume estimation. MUSEFood uses the camera to capture photos of the food but, unlike existing volume measurement methods, requires neither training images with volume information nor a reference object of known size placed in the photo. In addition, considering the impact of different containers on the contour shape of foods, MUSEFood uses a multi-task learning framework to improve the accuracy of food segmentation, and a differential model applicable to various containers to further reduce the negative impact of container differences on volume estimation accuracy. Furthermore, MUSEFood uses the microphone and the speaker to accurately measure the vertical distance from the camera to the food in a noisy environment, thus scaling the size of the food in the image to its actual size. Experiments on real foods indicate that MUSEFood outperforms state-of-the-art approaches and greatly improves the speed of food volume estimation.
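The abstract's last step, measuring the camera-to-food distance with the speaker and microphone and then scaling pixel measurements to real-world size, can be illustrated with a minimal sketch. This is not the authors' implementation: the echo-delay value, focal length, and pixel width below are hypothetical placeholders, and the distance is derived from the standard round-trip time-of-flight relation combined with the pinhole-camera model.

```python
# Minimal sketch (assumed, not MUSEFood's actual code): acoustic ranging
# plus pinhole-camera scaling from pixels to real-world size.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def distance_from_echo(echo_delay_s: float) -> float:
    """A chirp emitted by the speaker reflects off the food and returns
    to the microphone; half the round-trip time times the speed of
    sound gives the vertical camera-to-food distance in meters."""
    return SPEED_OF_SOUND * echo_delay_s / 2.0

def pixel_to_real_size(pixels: float, distance_m: float,
                       focal_length_px: float) -> float:
    """Pinhole model: real_size = pixels * distance / focal_length,
    with the focal length expressed in pixel units."""
    return pixels * distance_m / focal_length_px

# Hypothetical numbers: a 2 ms echo delay and a food region 400 px wide
# seen by a camera whose focal length is 3000 px.
d = distance_from_echo(0.002)                   # -> 0.343 m
width_m = pixel_to_real_size(400, d, 3000.0)    # -> ~0.0457 m
print(f"distance: {d:.3f} m, food width: {width_m:.4f} m")
```

Once each linear dimension in the image is scaled this way, cross-sectional areas and hence volumes can be computed in real units, which is the role the distance measurement plays in the pipeline described above.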
Pages: 899-906
Page count: 8