The Middletown Press (Middletown, CT)
Diet assessment is reimagined
Diet is a leading cause of premature mortality and the miseries of antecedent morbidity in the United States and an increasing contributor to the global burden of disease as well. There is ever more focus on this matter in the scientific literature, including calls for more routine attention to diet in clinical practice.
In the modern era, however, much suggests that better training of clinicians to do better counseling of their clients/patients is at best a partial solution to intractable problems. Lessons in better driving, for example, however appropriate, have done far less to reduce highway fatalities than engineered solutions, from airbags to safety cages, anti-lock brakes to lane departure warnings. More people more reliably get just where they are going not because of better delivered directions, or any success in getting more guys to ask for them, but because of ubiquitous GPS systems. So, too, perhaps, for diet, where the challenge is much the same: getting from where you are to where you want, and ought, to be.
As any of you who have ever completed a diet diary or logged your foods know, traditional methods of dietary intake assessment have important limitations, to say the least. They are generally tedious, labor-intensive, time-consuming, memory-dependent, analysis-dependent at the n-of-1 level (usually not your problem, but still, someone's problem!), and often quite prone to inaccuracies in spite of all that.
Innovation to date, however, has been limited to what might be deemed tinkering at the margins. Traditional pen-and-paper versions of dietary intake assessment tools have been transposed to the web. With cell phone cameras increasingly ubiquitous, much attention has shifted to capturing and uploading food photos as an alternative to stand-alone narrative descriptions.
Image capture of this kind suffers two limitations of its own, however. First, like all extant tools, this method must develop a representation of habitual dietary intake one eating occasion at a time. A critical mass of photos must be uploaded to represent the overall, usual dietary pattern, which demands a large number of interpretable food photos that are also representative in their variety.
The other major problem with reliance on food and meal images in lieu of narrative descriptions is image recognition. Even the highly trained human eye may have difficulty differentiating a grilled cut of chicken, tuna, or beef on a plate, to say nothing of far less transparent food assemblies, from soups to stews to salads to sandwiches to almost any prepared food or dish involving multiple ingredients.
Current dietary intake assessments all start with individual foods in an effort to assemble a representation of the habitual diet. Diet ID — an innovation on which colleagues and I have been working for the past several years, and now nearing commercial availability — does the converse. It begins with fully assembled dietary prototypes, compiled by a team of experts, based on the defining attributes of a given kind of diet; specific, objective measures of diet quality; and the translation of these two “coordinates” into the composite image of a multiday menu plan representing one particular version of habitual dietary intake.
By displaying such images and inviting the choice of the “best fit,” the method relies on pattern recognition, and does not depend at all on the recollection of details. To some extent, Diet ID even benefits from relative inattention to details, and a quick response to the “big picture” of diet, as per the argument made by Malcolm Gladwell in his book “Blink.”
Diet-quality photo navigation may be likened to lens selection by an optometrist or ophthalmologist. The identification of the “correct” lens for any given eye does not involve filling out a lengthy questionnaire addressing recollection of visual acuity under differing conditions of light, stress, fatigue and so on. Rather, the process begins with fully formed lenses, and the views through them. Your task is simply to differentiate clear from blurry images, and repeat the process until lenses as nearly optimal as possible are identified. Diet ID does exactly the same with diet.
Diet ID provides these four functions — find, choose, navigate, track — in diet “space,” just as GPS does in geographic space. Our aim is to deliver this functionality to you via apps, programs and websites you use already. We are delighted to introduce an innovation we are confident will make it a whole lot easier…to get there, from here. To learn more, visit www.dqpn.io.