Intelligent Assistive Technology and Systems Lab

IATSL develops assistive technology that is adaptive, flexible, and intelligent, enabling users to participate fully in their daily lives.



Intelligent Nutritional Assessment system (INA)

Keywords: Computer vision, machine learning, nutrition, older adults, dementia.

Overview of Research

Undernutrition is a global problem in elderly populations. The WHO states that good nutritional status is an important determinant of quality of life and that many of the diseases from which older adults suffer result from dietary factors compounded with changes that naturally happen with ageing [1]. Fortunately, diet and other environmental factors that affect morbidity and mortality are modifiable, and disease prevention or delay are possible through proper nutrition.

To shed light on disease causation, reliable and accurate nutritional data is essential, especially in free-living older populations. However, current dietary assessment methods are labour-intensive, time-consuming, and inaccurate because they rely on self-reporting. Moreover, older adults with dementia are particularly vulnerable to malnutrition and cannot report their own food intake consistently due to memory problems. Technology has the potential to assist in nutritional analysis by alleviating the cognitive load of recording food intake and lessening the burden of care for the elderly. Our work focuses on this goal.

We propose an Intelligent Nutritional Assessment system (INA) that would monitor the dietary patterns of older adults with dementia in their own homes. The goal of the system is to automatically recognize the food that is on the person's plate, estimate the portion size and nutritional value of the consumed meal, and provide active feedback and prompts to the user (Figure 1).

Our system consists of a web camera placed above a plate, a kitchen scale, and a computer with a monitor that runs the algorithms we developed. In its current state, the system assesses nutrition by measuring dietary intake from images of one's meals. Starting from a food image database and an image of a meal, the system analyzes the meal by (1) segmenting the image, (2) recognizing the food items, and (3) estimating the portion size of each one. Once the food labels and portion sizes are known, the meal's nutritional content is displayed to the user by indexing into a national nutritional database and extracting the relevant information.
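The three-stage pipeline above can be sketched end to end. Everything in this sketch — the toy 4×4 "image", the colour "signatures", the scale readings, and the nutrient table — is invented for illustration; the real system uses learned segmentation and recognition models rather than colour thresholds.

```python
# Toy end-to-end sketch of the INA pipeline: segment -> recognize ->
# estimate portion -> look up nutrients. All data here is hypothetical.

# A 4x4 "image": each pixel is an (R, G, B) tuple. White pixels are the plate.
IMAGE = [
    [(255, 255, 255), (255, 255, 255), (255, 255, 255), (255, 255, 255)],
    [(255, 255, 255), (42, 158, 61),   (38, 162, 58),   (255, 255, 255)],
    [(255, 255, 255), (41, 159, 63),   (39, 161, 59),   (255, 255, 255)],
    [(255, 255, 255), (255, 255, 255), (255, 255, 255), (255, 255, 255)],
]

# Invented per-food mean colours and a tiny stand-in nutrient table.
FOOD_SIGNATURES = {"peas": (40, 160, 60), "carrots": (230, 120, 30)}
NUTRIENTS_PER_100G = {"peas": {"kcal": 81}, "carrots": {"kcal": 41}}

def segment(image, white_thresh=200):
    """Step 1: keep (row, col) pixels that are clearly not the white plate."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, (R, G, B) in enumerate(row)
            if not (R > white_thresh and G > white_thresh and B > white_thresh)]

def recognize(image, food_pixels):
    """Step 2: label the segment by nearest mean colour (toy classifier)."""
    n = len(food_pixels)
    mean = tuple(sum(image[r][c][i] for r, c in food_pixels) / n
                 for i in range(3))
    def dist(sig):
        return sum((m - s) ** 2 for m, s in zip(mean, sig))
    return min(FOOD_SIGNATURES, key=lambda name: dist(FOOD_SIGNATURES[name]))

def portion_grams(weight_with_food, empty_plate_weight):
    """Step 3: portion size from two kitchen-scale readings."""
    return weight_with_food - empty_plate_weight

def nutrition(label, grams):
    """Index the nutrient table and scale it to the estimated portion."""
    return {k: v * grams / 100.0 for k, v in NUTRIENTS_PER_100G[label].items()}

pixels = segment(IMAGE)
label = recognize(IMAGE, pixels)          # "peas"
grams = portion_grams(412, 330)           # 82 g
meal_info = nutrition(label, grams)       # {"kcal": 66.42}
```

A deployed version would replace the colour-threshold segmenter and nearest-colour classifier with trained models, and index a real nutritional database instead of the two-entry table.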

Future Work

So far, we have focused on the food recognition and portion estimation components, which were tested in a controlled laboratory environment, and the system currently assumes a cognitively healthy user. In the future, we would expand INA to accommodate people with dementia by shifting from user-initiated interaction to continuous monitoring via a video stream, so that INA could detect upcoming eating events and prompt the user accordingly.
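Such continuous monitoring could start from something as simple as frame differencing, flagging a candidate eating event when the scene under the camera changes substantially (a plate being set down, for instance). The function names and thresholds below are hypothetical; a deployed system would use a proper event-detection model.

```python
# Crude frame-differencing sketch of eating-event detection.
# Frames are flattened lists of grayscale intensities (0-255).

def fraction_changed(prev_frame, curr_frame, pixel_thresh=30):
    """Fraction of pixels whose intensity changed noticeably between
    two consecutive frames of equal size."""
    changed = sum(1 for a, b in zip(prev_frame, curr_frame)
                  if abs(a - b) > pixel_thresh)
    return changed / len(prev_frame)

def meal_event(prev_frame, curr_frame, frac_thresh=0.2):
    """Flag a candidate eating event when enough of the scene changes,
    e.g. a plate appearing under the camera."""
    return fraction_changed(prev_frame, curr_frame) >= frac_thresh

empty_table = [250] * 100                 # bright, uniform tabletop
with_plate = [250] * 60 + [80] * 40      # dark food covers 40% of the view
```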

The most important extension, however, is personalizing INA's behaviour to its user. The system would learn its user's eating patterns and adapt its estimates to the particular food items and portions that individual eats. User feedback would be required for this learning phase, since even a large food database could not encompass the endless variety of foods and their preparations.
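One hypothetical way to realize this feedback loop is to keep per-user counts of confirmed foods and use them as a smoothed prior that re-weights the recognizer's raw scores. The class, method names, and re-weighting rule below are illustrative, not INA's actual mechanism.

```python
from collections import Counter

class PersonalizedRanker:
    """Re-rank classifier scores with a Laplace-smoothed per-user prior
    learned from foods the user has previously confirmed (toy sketch)."""

    def __init__(self):
        self.confirmed = Counter()

    def confirm(self, label):
        # Called when the user accepts a recognition result.
        self.confirmed[label] += 1

    def best(self, scores):
        # scores: {food_label: classifier_score}, higher is better.
        total = sum(self.confirmed.values()) + len(scores)  # smoothing mass
        def weighted(label):
            prior = (self.confirmed[label] + 1) / total     # add-one prior
            return scores[label] * prior
        return max(scores, key=weighted)

ranker = PersonalizedRanker()
scores = {"mashed potato": 0.48, "rice": 0.45}
# Before any feedback the classifier's top score wins; after the user
# repeatedly confirms "rice", the prior tips close calls toward it.
```

The design choice here is that personalization only reorders near-ties rather than overriding confident predictions, since the prior is multiplied into, not substituted for, the classifier's score.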

Figure 1: Examples of images from the food image dataset we created. The images were annotated by humans to identify the plate and food segments; INA's accuracy is calculated by comparing its predictions against these annotations.
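The comparison against human annotations can be scored with a standard segmentation-overlap measure; intersection-over-union on sets of pixel coordinates is a common choice, sketched here (the exact metric INA uses is not specified above).

```python
def iou(predicted, annotated):
    """Intersection-over-union of two sets of (row, col) food pixels:
    1.0 means perfect agreement with the human annotation, 0.0 none."""
    p, a = set(predicted), set(annotated)
    if not p and not a:
        return 1.0  # both empty: trivially perfect agreement
    return len(p & a) / len(p | a)

predicted = [(0, 0), (0, 1), (1, 1)]
annotated = [(0, 1), (1, 1), (1, 2)]
# 2 shared pixels out of 4 distinct -> IoU = 0.5
```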


  1. World Health Organization, “Nutrition for older persons. Ageing and nutrition: a growing global challenge,” August 2011.


Canadian Institutes of Health Research (CIHR)

Research Team

Yulia Eskin, University of Toronto

Alex Mihailidis, University of Toronto

Arlene Astell, University of St. Andrews