Development and Validation of an Objective, Passive Dietary Assessment Method for Estimating Food and Nutrient Intake in Households in Low- and Middle-Income Countries: A Study Protocol

Modou L Jobarteh, Megan A McCrory, Benny Lo, Mingui Sun, Edward Sazonov, Alex K Anderson, Wenyan Jia, Kathryn Maitland, Jianing Qiu, Matilda Steiner-Asiedu, Janine A Higgins, Tom Baranowski, Peter Olupot-Olupot, Gary Frost

Abstract

Malnutrition is a major concern in low- and middle-income countries (LMIC), but the full extent of nutritional deficiencies remains unknown, largely due to the lack of accurate assessment methods. This study seeks to develop and validate an objective, passive method of estimating food and nutrient intake in households in Ghana and Uganda. Household members (including children under 5 y and adolescents) are assigned a wearable camera device to capture images of their food intake during waking hours. Using custom software, the captured images are then used to estimate an individual's food and nutrient (i.e., protein, fat, carbohydrate, energy, and micronutrient) intake. Passive food image capture and assessment provides an objective measure of food and nutrient intake in real time, minimizing some of the limitations associated with self-reported dietary intake methods. Its use in LMIC could increase understanding of a population's nutritional status and of the contribution of household food intake to the malnutrition burden. This project is registered at clinicaltrials.gov (NCT03723460).

Keywords: assessment; camera; devices; dietary; food; household; intake; nutrient; undernutrition; wearable.

Copyright © The Author(s) 2020.

Figures

FIGURE 1
Wearable camera devices used for passive capture of images of food intake and food-related activities in the households: (A) Foodcam, (B) Automatic Ingestion Monitor (AIM), (C) Ear-worn, (D) eButton, and (E) eHAT.
FIGURE 2
A schematic diagram of the study plan. Wearable and fixed camera devices will be used to collect images of food intake and related activities such as cooking in households in Ghana and Uganda. Images captured by the devices will be stored in cloud storage, and the stored food images will be used for food recognition and estimation of portion size and nutrient content, thus providing objective, passive dietary assessment.
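The last step of the pipeline in Figure 2 — converting recognized foods and estimated portion sizes into nutrient intake — can be illustrated with a minimal sketch. This is not the study's actual software; the food names and per-100 g composition values below are hypothetical placeholders standing in for entries from a standard food composition database.

```python
# Hypothetical per-100 g composition values standing in for a food
# composition database entry (illustrative numbers only).
COMPOSITION_PER_100G = {
    "maize porridge": {"energy_kcal": 60.0, "protein_g": 1.5, "fat_g": 0.5, "carb_g": 13.0},
    "groundnut stew": {"energy_kcal": 180.0, "protein_g": 7.0, "fat_g": 14.0, "carb_g": 6.0},
}

def nutrient_intake(recognized_portions):
    """Sum nutrient totals over (food, grams) pairs, scaling each
    food's per-100 g composition by the estimated portion mass."""
    totals = {"energy_kcal": 0.0, "protein_g": 0.0, "fat_g": 0.0, "carb_g": 0.0}
    for food, grams in recognized_portions:
        per100 = COMPOSITION_PER_100G[food]
        for nutrient, value in per100.items():
            totals[nutrient] += value * grams / 100.0
    return totals

# One recognized meal: food labels from image recognition, portion
# masses (g) from image-based portion size estimation.
meal = [("maize porridge", 250), ("groundnut stew", 120)]
print(nutrient_intake(meal))
```

In the study itself, region-appropriate composition tables (e.g., for West African foods) would replace the placeholder dictionary, and portion masses would come from the image-based volume estimation step.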
FIGURE 3
The Automatic Ingestion Monitor (AIM) software—an image annotation and dietary intake assessment tool. AIM distinguishes between eating episodes (blue bars, bottom left) and noneating episodes. Clicking on a blue bar displays images from the detected eating event; images can be browsed individually and magnified to provide a clear view of all the foods in the image. The software supports both the incorporation of standard food composition databases and visual estimation of portion sizes.
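The episode display described above relies on segmenting a stream of per-sample eating detections into contiguous eating events. As an illustration only (the AIM software's actual detection logic is not specified here), a simple run-length grouping over a binary detector output might look like this; the `min_len` threshold for discarding spurious short runs is an assumed parameter, not one taken from the study:

```python
def eating_episodes(flags, min_len=3):
    """Group consecutive positive detector samples into (start, end)
    episodes, discarding runs shorter than min_len samples.

    flags: sequence of 0/1 per-sample eating detections.
    Returns a list of half-open (start, end) index pairs.
    """
    episodes, start = [], None
    for i, flag in enumerate(flags):
        if flag and start is None:
            start = i  # a new candidate episode begins
        elif not flag and start is not None:
            if i - start >= min_len:
                episodes.append((start, i))
            start = None  # run ended; keep it only if long enough
    # Close out an episode that runs to the end of the signal.
    if start is not None and len(flags) - start >= min_len:
        episodes.append((start, len(flags)))
    return episodes

# A toy detector trace: two genuine eating runs and one spurious blip.
print(eating_episodes([0, 1, 1, 1, 0, 0, 1, 0, 1, 1, 1, 1]))
```

Each returned pair corresponds to one "blue bar" in the interface, from which the associated images would be retrieved for review.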


Source: PubMed
