Role of multisensory integration in facial emotion understanding; developmental and neuropsychological approaches – INTERFACE
This project will include three main tasks. The first task will focus on healthy adults; the main objective is to understand more precisely how, and at which level, olfaction affects cognitive processes. The second task will consist of infant studies; the purpose is to determine whether infants can associate odour-induced feelings with facial expressions and, if so, from what age. Finally, in the third task, we will study schizophrenic patients, who are known to have both a deficit in facial expression recognition and a deficit in expressiveness (i.e., flat affect). This approach, coupling cognitive psychology, developmental psychology, ethology and neuropsychology, will allow a better understanding of the processes at play during multisensory integration and facial emotion recognition/understanding, by comparing human cognitive and emotional processes at different stages: mature, under acquisition, or altered.
The main results obtained so far concern infancy. Recognizing emotional facial expressions in a multisensory way is a crucial skill for adaptive behaviour, and its cognitive development remains a topic of intense investigation. At 5 to 7 months, infants look longer at a dynamic angry or happy face that emotionally matches a vocal expression, indicating that they are able to match stimuli conveyed by distinct modalities on the basis of their emotional content. Such abilities have been shown as early as 3.5 months with highly familiar stimuli (e.g., the mother's face and voice). Like emotional prosody, odours are good candidates to convey emotional information through their hedonic value, to arouse emotions and feelings, and to elicit specific facial expressions. In a first study, olfaction–vision matching abilities were assessed in three age groups (3, 5 and 7 months), using dynamic expressive faces (happy vs. disgusted) and distinct hedonic olfactory contexts (pleasant, unpleasant and control) in a visual-preference paradigm. The results indicated shifting patterns between 3 and 7 months. An interaction between odour and facial emotion was found for the 3-month-olds only: they were able to match the odour and the expressive face according to their emotional content. An effect of facial expression was found in the 5-month-olds, with a preference for disgusted over happy faces, whereas no effect was found in the oldest group.
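As a purely illustrative sketch (not the project's actual analysis pipeline), looking-time data from such a visual-preference paradigm are typically reduced to a preference score per olfactory context before testing the odour-by-expression interaction; every variable name and value below is hypothetical.

```python
# Hypothetical illustration: visual-preference score = proportion of total
# looking time spent on the happy face, computed per olfactory context.

def preference_score(look_happy_s: float, look_disgust_s: float) -> float:
    """Proportion of total looking time directed at the happy face (0-1)."""
    total = look_happy_s + look_disgust_s
    return look_happy_s / total if total > 0 else float("nan")

# Hypothetical looking times (in seconds) for one infant in the pleasant,
# unpleasant and control (odourless) contexts: (happy face, disgusted face).
trials = {
    "pleasant":   (7.2, 4.1),
    "unpleasant": (3.8, 6.5),
    "control":    (5.5, 5.3),
}

for context, (happy_s, disgust_s) in trials.items():
    score = preference_score(happy_s, disgust_s)
    print(f"{context:>10}: preference for happy face = {score:.2f}")
```

Scores above 0.5 would indicate longer looking at the happy face; comparing these scores across olfactory contexts and age groups is one way the matching pattern described above could be quantified.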
For task 2 (multisensory integration and the development of facial expression recognition in infancy), the main objective will be to investigate further how, and from which age, the olfactory context influences the recognition of facial emotions. In one experiment, we are studying whether the olfactory context elicits infants' expectations for specific facial actions in neutral static expressions. In another experiment, we are studying whether infants are able to perceive the emotional state of their mother from stress-elicited body odours, and whether they express expectations for specific facial information.
In task 3 (multisensory integration and the deficit of facial expression recognition in schizophrenia), the aim will be to assess whether the deficit of facial expressiveness in schizophrenia influences both the categorical perception of facial expressions and the contextual effect of olfactory stimulations. More generally, the hypothesis of the involvement of a mirror system predicts that patients' lack of facial expressiveness will decrease their facial reactions to odours and, consequently, lower the potential influence of this context; these patterns should correlate with the deficit in facial expressiveness. However, one may also predict that schizophrenic patients will exhibit an enhanced reactivity to the olfactory context, with a specific pattern of biases compared with healthy controls. In a previous study, we showed that the categorical boundaries of facial expressions are less well defined in schizophrenic patients than in controls, with a greater number of intrusions in patients (i.e., patients more frequently perceived an expression that was not part of the continuum; Vernet, Baudouin, & Franck, 2008). This higher sensitivity to intrusions may favour the emergence of effects of the olfactory context. The aim of task 3 will thus be to test these hypotheses further.
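For illustration only, the sketch below shows one common way of estimating a categorical boundary along a morphed expression continuum, by fitting a logistic psychometric function to identification responses; the morph levels, response proportions and parameter values are hypothetical and are not taken from Vernet, Baudouin, & Franck (2008).

```python
# Hypothetical illustration of categorical-boundary estimation along a
# happy-to-disgusted morph continuum; data and values are invented.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, x0, k):
    """Psychometric function: probability of a 'disgusted' response."""
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

# Morph levels (0 = 100% happy, 1 = 100% disgusted) and hypothetical
# proportions of 'disgusted' responses for one participant.
morph = np.linspace(0.0, 1.0, 7)
p_disgust = np.array([0.02, 0.05, 0.20, 0.55, 0.85, 0.95, 0.98])

(x0, k), _ = curve_fit(logistic, morph, p_disgust, p0=[0.5, 10.0])
print(f"category boundary at morph level {x0:.2f}, slope {k:.1f}")
# A shallower slope (smaller k) would correspond to a less well-defined
# boundary, the pattern reported for patients relative to controls.
```

In this kind of analysis, the boundary location (x0) and slope (k) can then be compared between olfactory contexts and between patients and controls to test whether the odour shifts or blurs the category boundary.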
Godard, O., Baudouin, J.-Y., Martin, S., Schaal, B., & Durand, K. (2012). Hedonic matching of faces and odors: Already effective in infants aged 3, 5 and 7 months? Poster presented at the International Congress of Psychology (ICP 2012).
Leleu, A., Demily, C., Franck, N., Durand, K., Baudouin, J.-Y., & Schaal, B. (2013). Matching emotional expressions of faces within an olfactory context: Does my own feeling matter? Poster presented at the International Symposium "Vision, Action and Concepts: Behavioural and Neural Basis of Embodied Perception and Cognition", Lille, October 2013.
The general objective of the project is to investigate the role of multisensory integration in the ability to recognize and understand facial emotions, by considering both developmental aspects in infancy and neuropsychological aspects in schizophrenia. The main hypothesis is that multisensory integration boosts the ability to recognize facial emotions, and its development, through a matching between one's own facial reactions to multisensory stimulation and the facial expressions of conspecifics. One question is how and when such a matching mechanism might influence facial expression processing. This issue will be addressed by studying healthy adults with both behavioural and electrophysiological approaches. We will also investigate the role of matching self-generated expressions with the expressions of others by studying patients with deficits in both expressiveness and expression recognition, namely schizophrenic patients. The second hypothesis is that non-visual sensory modalities may shape and constrain the matching process. Each modality has its own sensory processes and hedonic properties, and positive or negative feelings arise from these specific sensations. A direct consequence is that no modality will integrate the emotional environment in a way that directly corresponds to facial emotion categories (e.g., happiness, anger, disgust, fear, and sadness), at least before associations are learned. Thus, multisensory integration not only promotes coupling processes between self-generated expressions and the expressions of others, it can also shape them according to the characteristics of each modality and to their development. We will test this hypothesis in infants.
This project will include three main tasks. The first task will focus on healthy adults. The main objective will be to understand whether, how, and when the olfactory context influences facial emotion recognition. In experiment 1, we will study the influence of the olfactory context on facial emotion categories, to assess whether the boundaries of these categories are flexible as a function of the emotional meaning of the odour. In experiment 2, we will study the potential early influence of the olfactory context on visual attention, using a visual search paradigm. Finally, in experiment 3, we will study further at which (temporal) level the olfactory context is integrated, using ERP recordings. The second task will consist of infant studies. The purpose will be to determine whether infants can associate odour-induced feelings with facial expressions and, if so, from what age. In experiment 4, we will study infants' matching abilities between an odour and dynamic facial expressions. In experiment 5, we will study whether the olfactory context elicits infants' expectations for specific facial actions in neutral static expressions. In experiment 6, we will study whether infants are able to perceive the emotional state of their mother from stress-elicited body odours, and whether they express expectations for specific facial information. In the third task, we will study schizophrenic patients, who are known to have both a deficit in facial expression recognition and a deficit in expressiveness (i.e., flat affect). The purpose of this task will be to assess whether the deficit of facial expressiveness in schizophrenia influences both the categorical perception of facial expressions and the contextual effect of olfactory stimulations. To test this hypothesis, we will adapt experiments 1 and 3 (task 1) to the study of schizophrenic patients.
This approach, coupling cognitive psychology, developmental psychology, ethology and neuropsychology, will allow a better understanding of the processes at play during multisensory integration and facial emotion recognition/understanding, by comparing human cognitive and emotional processes at different stages: mature, under acquisition, or altered.
Project coordination
Jean-Yves Baudouin (Centre des Sciences du Goût et de l'Alimentation) – Jean-Yves.Baudouin@u-bourgogne.fr
The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility for its content.
Partner
CSGA Centre des Sciences du Goût et de l'Alimentation
CNRS - CNC Centre de Neurosciences Cognitives
ANR grant: 220,000 euros
Beginning and duration of the scientific project:
October 2011
- 36 months