Sun ASA24-Kids

Accurate assessment of dietary intake is important for 1) evaluating the outcomes of dietary change interventions, 2) assessing the relationship between diet and health outcomes (e.g., obesity, metabolic syndrome), 3) identifying factors (e.g., environmental, psychosocial) influencing dietary intake that could be targeted in change programs, and 4) assessing diet changes over time (surveillance). Our research revealed that 35% of the foods children reported consuming the previous day were imaginary (called intrusions: observers did not see those foods consumed) and 15% were forgotten (called omissions: foods observers recorded but the children did not report). This level of error must be reduced to make assessment of children's dietary intake accurate. Pictures of actual meals, taken as served and after consumption, should minimize intrusions and omissions. The smart phone method of taking meal pictures is problematic for children, however, because it requires remembering to carry and use the phone correctly at the right time, even in social circumstances (e.g., with certain friends) in which reticence, or even embarrassment, may reduce some children's willingness to draw that kind of attention to themselves. Dr. Mingui Sun (University of Pittsburgh) has developed a multisensory unit, the eButton, attached to the shirt, which includes a camera, battery, and storage to record pictures of everything in front of a child at 2 to 10 second intervals throughout the day. The Sun System requires a trained research assistant to identify the foods and to use an on-screen malleable wire-mesh feature to estimate amounts consumed (about 10 minutes per day of images), with the estimates then verified by a nutrition/dietetics professional (about 5 minutes per day of images). A limitation of this system is that many foods are hard to recognize in a photo (e.g., poor lighting, odd angle, unusual food, unusual preparation or presentation of the food). Similar problems will occur with an automated system that uses pattern recognition to identify the foods. The child who consumed the foods, by contrast, should be able to recognize an image of those foods without much difficulty. The ASA24-Kids, adapted to children's abilities from the Automated Self-Administered 24-hour recall (ASA24) for adults, enables children to self-report what they consumed the previous day. Combining the Sun System with ASA24-Kids may permit recording children's dietary intake in a way that minimizes the limitations of a child's 24-hour memory by providing images of the foods consumed, yet allows the child to review the foods in those images. While the feasibility of using an automated image capture method for dietary assessment has been established among adults, parallel work has not been done among children. We therefore propose creating Sun-ASA24-Kids, adapting it to children's abilities, and validating it in a field study. The Sun-ASA24-Kids system could minimize two major sources of child reporting error (intrusions and omissions) as well as errors from unclear pictures.
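For readers unfamiliar with these error measures, the short Python sketch below illustrates how intrusion and omission rates can be computed by comparing a child's reported foods against an observer's record; the function name and the example foods are hypothetical illustrations, not part of the proposed system.

# Illustrative sketch only: intrusion and omission rates as defined above,
# computed by comparing a child's 24-hour recall against direct observation.
# The data and function name are hypothetical.

def recall_error_rates(observed: set[str], reported: set[str]) -> dict[str, float]:
    """Return intrusion and omission rates for one child's recall.

    Intrusions: foods reported but never observed (share of reported foods).
    Omissions:  foods observed but never reported (share of observed foods).
    """
    intrusions = reported - observed
    omissions = observed - reported
    return {
        "intrusion_rate": len(intrusions) / len(reported) if reported else 0.0,
        "omission_rate": len(omissions) / len(observed) if observed else 0.0,
    }

# Hypothetical example
observed = {"milk", "apple", "pizza", "carrots", "cookie"}
reported = {"milk", "apple", "pizza", "ice cream"}  # "ice cream" is an intrusion
print(recall_error_rates(observed, reported))
# {'intrusion_rate': 0.25, 'omission_rate': 0.4}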

Public Health Relevance

A passive diet assessment system (developed by Mingui Sun) that takes images at 2-10 second intervals throughout the day should minimize common dietary recall memory errors among children. Having children use ASA24-Kids, a 24-hour diet assessment program, to review and edit the automated identification of foods and the estimates of intake from the mealtime images should minimize machine errors. This grant application proposes to combine the Sun sensor system with ASA24-Kids and to conduct a field validation of its estimates.
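As a rough illustration of the review-and-edit step described above, the hypothetical Python sketch below pairs a machine-proposed food identification from a captured image with the child's confirmation or correction; the class and field names are invented for illustration and are not the actual Sun-ASA24-Kids data model.

# Minimal, hypothetical sketch of the child review step: the automated label and
# portion estimate are kept unless the child edits them via ASA24-Kids.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FoodObservation:
    image_id: str              # which captured image the item came from
    machine_label: str         # food identified automatically from the image
    machine_portion_g: float   # automated portion estimate, in grams
    child_label: Optional[str] = None       # child's confirmation/correction
    child_portion_g: Optional[float] = None

    def final_label(self) -> str:
        # The child's review takes precedence over the automated identification.
        return self.child_label or self.machine_label

    def final_portion_g(self) -> float:
        return self.child_portion_g if self.child_portion_g is not None else self.machine_portion_g

# Hypothetical example: the child corrects a misidentified food.
obs = FoodObservation("img_0734", "rice", 150.0)
obs.child_label = "mashed potatoes"
print(obs.final_label(), obs.final_portion_g())  # mashed potatoes 150.0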

Agency: National Institutes of Health (NIH)
Institute: National Cancer Institute (NCI)
Type: Exploratory/Developmental Grants (R21)
Project #: 1R21CA172864-01A1
Application #: 8582288
Study Section: Special Emphasis Panel (ZCA1-SRLB-B (M1))
Program Officer: Subar, Amy
Project Start: 2013-09-18
Project End: 2015-08-31
Budget Start: 2013-09-18
Budget End: 2014-08-31
Support Year: 1
Fiscal Year: 2013
Total Cost: $189,000
Indirect Cost: $46,210
Name: Baylor College of Medicine
Department: Pediatrics
Type: Schools of Medicine
DUNS #: 051113330
City: Houston
State: TX
Country: United States
Zip Code: 77030
Zhao, Qi; Zhang, Boxue; Wang, Jingjing et al. (2017) Improved method of step length estimation based on inverted pendulum model. Int J Distrib Sens Netw 13:
Lyons, Elizabeth J; Baranowski, Tom; Basen-Engquist, Karen M et al. (2016) Testing the effects of narrative and play on physical activity among breast cancer survivors using mobile apps: study protocol for a randomized controlled trial. BMC Cancer 16:202
Xin, Miao; Zhang, Hong; Wang, Helong et al. (2016) ARCH: Adaptive recurrent-convolutional hybrid networks for long-term action recognition. Neurocomputing 178:87-102
Baranowski, Tom (2015) Are active video games useful to combat obesity? Am J Clin Nutr 101:1107-8
Li, Zhen; Wei, Zhiqiang; Yue, Yaofeng et al. (2015) An adaptive Hidden Markov model for activity recognition based on a wearable multi-sensor device. J Med Syst 39:57
Zhang, Boxue; Zhao, Qi; Feng, Wenquan et al. (2015) SIFT-Based Indoor Localization for Older Adults Using Wearable Camera. Proc IEEE Annu Northeast Bioeng Conf 2015:
Sun, Mingui; Burke, Lora E; Baranowski, Thomas et al. (2015) An exploratory study on a chest-worn computer for evaluation of diet, physical activity and lifestyle. J Healthc Eng 6:1-22
Zhao, Qi; Wang, Jingjing; Feng, Wenquan et al. (2015) Assessing Physical Performance in Free-Living Older Adults with a Wearable Computer. Proc IEEE Annu Northeast Bioeng Conf 2015:
Li, Yuecheng; Jia, Wenyan; Yu, Tianjian et al. (2015) A Low Power, Parallel Wearable Multi-Sensor System for Human Activity Evaluation. Proc IEEE Annu Northeast Bioeng Conf 2015:
Cheng, Feiyang; Zhang, Hong; Sun, Mingui et al. (2015) Cross-trees, Edge and Superpixel Priors-based Cost aggregation for Stereo matching. Pattern Recognit 48:2269-2278

Showing the most recent 10 out of 16 publications