Artificial intelligence for food-related emotions analysis – aiMotions
Emotions are an essential factor that significantly influences our behavior. Eating behavior is no exception, and the relationships between emotions and food are of great interest to the scientific community. Emotional states influence eating behavior, including the motivation to eat, food choice, and the amount of food intake. However, understanding why and how emotions affect eating behavior, positively or negatively, remains widely debated. Links between food and emotional states can be interpreted from several perspectives, ranging from physiological to behavioral changes. Quantitative and qualitative analyses of these changes are therefore of primary importance for the validity of conclusions drawn in emotion studies in Food Science. To gain deeper insight into consumer behavior, diverse instruments for capturing emotional reactions have emerged in recent years, at both the explicit level (emotional lexicons, self-reported questionnaires) and the implicit level (physiological measures, brain activity, facial expressions). Ongoing technological advances in wearable devices for heart rate, eye tracking, or EEG, combined with the increasing quality of video recordings, make emotion capture easier. Despite their great potential, however, such sensors are still too invasive for daily-life food-consumption situations, and measuring emotions in an ecological context remains a real challenge for both fundamental and applied research. This may explain why most studies in Food Science still collect emotional feedback using only self-reported questionnaires, even though combining explicit and implicit measures offers the best opportunity to capture complementary information about the food-related experience.
In this context, the main scientific objective of aiMotions is to decipher how food elicits emotions in order to better understand the mechanisms underlying food choice behavior, thanks to an unprecedented joint contribution of Artificial Intelligence, Computer Vision, and Affective Computing to Food Science. To this end, aiMotions takes up the challenge of developing a new AI-based framework to capture and analyze the emotional responses elicited by food. We hypothesize that (H1) cameras can replace wearable sensors (heart rate, pupillometry, facial expressions) and reliably capture emotional responses without being intrusive; (H2) Computer Vision and Affective Computing, boosted by Artificial Intelligence, can accurately extract implicit emotional cues and analyze their evolution over time; and (H3) the joint use of this new AI-based framework with traditional explicit and implicit measures can pave the way for unprecedented studies and insights in Food Sciences. aiMotions targets three scientific objectives: (O1) release an open multimodal dataset of emotional responses elicited by images or odors; (O2) design an end-to-end AI-based framework for the automated analysis of emotions, and derive a lightweight, easy-to-use version that runs on a personal computer; and (O3) conduct a series of disruptive studies in Food Sciences that apply this AI-based framework, possibly coupled with physiological or behavioral measures, to gather more precise information on food-elicited emotions and better understand the underlying mechanisms.
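To make H1 and H2 concrete, the following is a minimal Python sketch of the kind of camera-only pipeline such a framework implies: detecting a face in each video frame and mapping it to a (valence, arousal) estimate over time. It uses standard OpenCV components; the affect model itself is a hypothetical placeholder, since the actual aiMotions architecture is not described in this summary.

```python
# Minimal sketch of a camera-only affect pipeline (assumptions: one
# participant per video; a trained valence/arousal model is available).
import cv2  # pip install opencv-python

# Standard Haar cascade face detector shipped with opencv-python.
FACE_DETECTOR = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def predict_valence_arousal(face_bgr):
    """Hypothetical stand-in for a trained affect-recognition network.

    A real implementation would run a CNN trained on a valence/arousal
    dataset; here we simply return a neutral estimate.
    """
    return 0.0, 0.0  # (valence, arousal), each assumed in [-1, 1]

def affect_time_series(video_path):
    """Return per-frame (t_seconds, valence, arousal) samples."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if metadata missing
    samples, frame_idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = FACE_DETECTOR.detectMultiScale(
            gray, scaleFactor=1.1, minNeighbors=5
        )
        if len(faces) > 0:
            x, y, w, h = faces[0]  # assume one participant per video
            v, a = predict_valence_arousal(frame[y:y + h, x:x + w])
            samples.append((frame_idx / fps, v, a))
        frame_idx += 1
    cap.release()
    return samples
```

A non-intrusive setup of this kind is exactly what would allow emotion capture in the ecological food-consumption situations where wearable sensors are too invasive.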
aiMotions will provide solid scientific knowledge, highlighting 1) an improved characterization of the dynamics of food-elicited emotions in terms of time (occurrence and duration), nature, and intensity (valence and arousal); 2) evidence for the validity of video-based methods relative to other implicit/explicit measures; and 3) a broader understanding of the relationships between cognition and emotion perception.
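For illustration, here is one simple way such a time series could be turned into the occurrence, duration, and intensity descriptors mentioned above. The threshold value and the episode definition are assumptions for the sketch, not the project's published methodology.

```python
# Illustrative only: derive emotional "episodes" from a valence/arousal
# time series, e.g. the output of the affect_time_series() sketch above.

def emotional_episodes(samples, arousal_threshold=0.3):
    """Group consecutive high-arousal samples into episodes.

    `samples` is a list of (t_seconds, valence, arousal) tuples.
    Returns one (onset, duration, mean_valence, peak_arousal) tuple
    per episode: onset/duration capture timing, mean valence captures
    the nature (positive/negative), peak arousal the intensity.
    """
    episodes, current = [], []

    def flush(run):
        onset = run[0][0]
        episodes.append((
            onset,
            run[-1][0] - onset,                       # duration (s)
            sum(s[1] for s in run) / len(run),        # mean valence
            max(s[2] for s in run),                   # peak arousal
        ))

    for t, v, a in samples:
        if a >= arousal_threshold:
            current.append((t, v, a))
        elif current:
            flush(current)
            current = []
    if current:  # close a trailing episode
        flush(current)
    return episodes
```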
We firmly believe there is still room for meaningful improvement in the analysis of emotions in the context of food consumption. The exploitation of digital technologies offers new perspectives on this topic and paves the way for a groundbreaking AI-based framework that will provide new insights for studies in Food Sciences.
Project coordination
Dominique GINHAC (LABORATOIRE INTERDISCIPLINAIRE CARNOT DE BOURGOGNE - UMR 6303)
The author of this summary is the project coordinator, who is responsible for its content. The ANR declines all responsibility for this content.
Partners
CSGA CENTRE DES SCIENCES DU GOUT ET DE L'ALIMENTATION - UMR 6265 - UMR A1324 - uB
ICB LABORATOIRE INTERDISCIPLINAIRE CARNOT DE BOURGOGNE - UMR 6303
IETR Institut d'Electronique et des Technologies du numéRique
ANR funding: 553,793 euros
Beginning and duration of the scientific project:
December 2023
- 48 months