A Human-Mimetic AI System for Automatic, Passive and Objective Dietary Assessment

An unhealthy diet is strongly linked to risks of chronic diseases, such as cardiovascular diseases, diabetes and certain types of cancer. The Global Burden of Disease Study has found that, among the top 17 risk factors, poor diet is overwhelmingly the No. 1 risk factor for human diseases. Despite the strong connection between diet and health, unhealthy foods with large portion sizes are widely consumed. Currently, 68.5% of U.S. adults are overweight, among the highest rates in developed countries. The recent decline in U.S. life expectancy sent another alarming signal about the general health of the American people. Understanding how diet-related risk factors affect people's health and finding effective ways to empower people to improve their lifestyle habits are among the most important tasks in public health. Unfortunately, dietary assessment in real-world settings has been exceedingly complex to implement and prone to inaccuracy. Technology is needed that allows researchers to assess dietary intake easily and accurately in real-world settings so that effective interventions to manage obesity and related chronic diseases can be developed. We propose a biomedical engineering project to address the dietary assessment problem, taking advantage of advanced mathematical modeling, wearable electronics and artificial intelligence.

Our research team has been improving the ability to assess diet for over a decade. We have designed the eButton, a small wearable device pinned on clothing in front of the chest, capable of collecting image-based dietary data objectively and passively (i.e., without depending on the subject's self-report or volitional operation of the device). We have also developed algorithms to compute food volumes and nutrients from images. Since the eButton was developed, it has been used by many researchers in the U.S. and other countries for objective and passive diet-intake studies in both adults and children.

Despite these past successes, two critical problems have lingered in objective and passive dietary assessment using wearable devices: 1) substantial manual effort is required for researchers to visually examine image data to identify foods and estimate their volumes (portion sizes), and 2) there are privacy concerns about researchers' viewing of participants' real-life images. Although solving these problems could enable the eButton and other wearable devices to support large-scale diet-intake studies, effective solutions were out of reach until recently, when artificial intelligence (AI) emerged. Advanced AI systems, especially those based on deep learning, can be trained on large amounts of labeled data to produce results comparable or even superior to those produced by humans in numerous fields of application. AI technology is also a powerful tool for dietary assessment, potentially providing an ideal solution to the two problems mentioned above. We thus propose to develop a human-mimetic AI system that recognizes foods from images, estimates portion sizes, and retrieves energy and nutrient values from a database in a fully automatic process. With this AI approach, researchers will no longer need to view participants' real-life images, and the system respects individuals' privacy because it is trained to recognize human foods only, nothing else.
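To make the envisioned processing chain concrete, the following is a minimal sketch of the fully automatic pipeline described above: food recognition from an image, portion-size (volume) estimation, and nutrient lookup from a database. All names here (recognize_food, estimate_volume, NUTRIENT_DB, FoodEstimate) and the numeric values are illustrative assumptions, not the project's actual software or data.

```python
# Minimal sketch of the envisioned automatic dietary-assessment pipeline
# (all names and values are illustrative placeholders, not the actual system).
from dataclasses import dataclass

@dataclass
class FoodEstimate:
    name: str           # recognized food label
    volume_ml: float    # estimated portion size (volume)
    energy_kcal: float  # energy value retrieved from the nutrient database

# Hypothetical energy densities (kcal per ml) standing in for a food database.
NUTRIENT_DB = {"rice": 1.3, "apple": 0.5, "milk": 0.6}

def recognize_food(image) -> str:
    # A trained deep-learning food classifier would run here; a fixed label
    # is returned only so the sketch executes end to end.
    return "rice"

def estimate_volume(image, food_name: str) -> float:
    # Image-based volume estimation (e.g., from wearable-camera images)
    # would run here; a fixed value is returned for illustration.
    return 150.0

def assess(image) -> FoodEstimate:
    """Chain recognition, portion-size estimation, and nutrient lookup."""
    name = recognize_food(image)
    volume_ml = estimate_volume(image, name)
    energy_kcal = volume_ml * NUTRIENT_DB.get(name, 0.0)
    return FoodEstimate(name, volume_ml, energy_kcal)

print(assess(image=None))  # FoodEstimate(name='rice', volume_ml=150.0, energy_kcal=195.0)
```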
Currently, the performance of existing AI systems is limited by the extensive variety and high variability of human foods, insufficient training data, and difficulty in finding appropriate nutritional information in food databases. In this application, we propose a new strategy to personalize the AI system for each research participant using an advanced mathematical model of personal food choices. With this personalization step, the dimensionality of our envisioned AI system can be reduced drastically, making our goal of automatic, objective and passive dietary assessment realistically achievable. We also propose to improve the electronic hardware and develop a biomimetic camera to enlarge the eButton's field of view. Finally, we will conduct a thorough evaluation of the personalized AI system in real-world settings with human subjects.
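One way to picture the dimensionality reduction from personalization is to restrict a generic food classifier's decision space to the foods a given participant actually eats. The sketch below illustrates that idea under assumed names (ALL_FOODS, personal_foods, personalize) and dummy scores; the proposal's mathematical model of personal food choices would supply the personal food set in practice.

```python
import numpy as np

# Hypothetical full food vocabulary of a generic recognition model.
ALL_FOODS = ["rice", "apple", "milk", "pizza", "salad", "noodles", "yogurt", "soup"]

# Assumed personal food-choice set for one participant; in the proposal this
# would come from a mathematical model of personal food choices.
personal_foods = {"rice", "milk", "soup"}

def personalize(scores: np.ndarray, vocabulary, allowed) -> str:
    """Mask out foods outside the participant's personal choice set,
    shrinking the effective decision space before picking the top label."""
    masked = np.where([f in allowed for f in vocabulary], scores, -np.inf)
    return vocabulary[int(np.argmax(masked))]

# Example: raw classifier scores over the full vocabulary (dummy values).
raw_scores = np.array([0.10, 0.30, 0.05, 0.25, 0.05, 0.10, 0.10, 0.05])
print(personalize(raw_scores, ALL_FOODS, personal_foods))  # -> "rice"
```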

Agency
National Institutes of Health (NIH)
Institute
National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK)
Type
Research Project (R01)
Project #
1R01DK127310-01
Application #
10111099
Study Section
Biomedical Computing and Health Informatics Study Section (BCHI)
Program Officer
Evans, Mary
Project Start
2021-01-01
Project End
2024-12-31
Budget Start
2021-01-01
Budget End
2021-12-31
Support Year
1
Fiscal Year
2021
Total Cost
Indirect Cost
Name
University of Pittsburgh
Department
Neurosurgery
Type
Schools of Medicine
DUNS #
004514360
City
Pittsburgh
State
PA
Country
United States
Zip Code
15213