DS0707 - Human-machine interaction, connected objects, digital content, big data and knowledge

Seconds that matter: Managing First Impressions for a more Engaging Virtual Agent – IMPRESSIONS

Managing an Agent's First Impressions of Warmth and Competence

In any encounter the first moments are critical, and the impressions that we form of others matter. The IMPRESSIONS project aims at creating computational models of communicative behavior for virtual Embodied Conversational Agents (ECAs) in order to manage first impressions when interacting with their users. These models include the capability of generating and exhibiting verbal and nonverbal behaviors (e.g. smiles, gaze, gestures) adapted to the user's behavior and physiological state. The behavior models support the adaptive generation of behaviors for attaining and managing first impressions of warmth and competence. Meanwhile, non-intrusive techniques advancing the state of the art in physiological measurement are used to detect and infer the user's physiological state during the first moments of interaction with the ECA. In addition to these models, the project outcomes will include results from controlled user studies and, through a collaboration with the Natural History Museum of Neuchâtel in Switzerland, a real-life public deployment.

From the user's perspective, the interaction will consist of natural face-to-face communication with the agent, using speech and gestures for example. By observing the agent's behavior, the user may (or may not) form impressions of its warmth and competence, as well as other impressions that we do not explicitly model.
The impressions will be assessed via subjective questionnaires administered after the interaction, as well as through real-time detection of the physiological states and multimodal behavior exhibited by the user during the interaction. Correlating these various sources of user response provides a rich set of information that is fed back to the agent to create a continuous interaction loop.
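To illustrate how questionnaire ratings might be related to a physiological measure, the sketch below computes a Pearson correlation over per-participant data. All names and data are hypothetical assumptions for illustration; the project's actual analysis pipeline is not specified here.

```python
# Illustrative sketch: correlating post-interaction questionnaire scores
# with an aggregate physiological measure per participant.
# The data and helper below are hypothetical, not the project's code.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-participant data:
warmth_scores = [4.0, 3.5, 5.0, 2.0, 4.5]   # Likert ratings of the agent
mean_arousal  = [0.6, 0.5, 0.8, 0.3, 0.7]   # normalized physiological signal
r = pearson(warmth_scores, mean_arousal)
```

A result near +1 or -1 would suggest that the subjective and physiological measures track each other, which is the kind of correspondence the project's interaction loop relies on.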

The agent manages the user's impressions of warmth and competence by exhibiting nonverbal multimodal behavior and adapting this behavior to the user's detected behavior and physiological state during the interaction.
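The adaptation step described above can be sketched as a simple perceive-and-adapt policy. Everything below is an illustrative assumption: the state fields, thresholds, and rules are placeholders, not the project's actual model.

```python
# Minimal sketch of the perceive-adapt loop described above.
# All names, thresholds, and rules are illustrative assumptions,
# not the IMPRESSIONS project's actual behavior model.
from dataclasses import dataclass

@dataclass
class UserState:
    smiling: bool      # e.g. detected from facial-expression analysis
    arousal: float     # e.g. inferred from physiological signals, in [0, 1]

def select_behavior(state: UserState) -> dict:
    """Map the detected user state to agent nonverbal behavior
    aimed at managing impressions of warmth and competence."""
    behavior = {"smile": False, "gesture_rate": 0.3, "gaze": "direct"}
    if state.smiling:
        # Reciprocate the user's smile to reinforce warmth.
        behavior["smile"] = True
    if state.arousal > 0.7:
        # High arousal: calmer gesturing and steady gaze as competence cues.
        behavior["gesture_rate"] = 0.1
    return behavior
```

In a running system a function like this would be called continuously as new perception results arrive, closing the interaction loop between detection and behavior generation.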

Interactive human-agent platform in which the agent manages the user's first impressions.

B. Biancardi, A. Cafaro, C. Pelachaud, "Could a Virtual Agent be Warm and Competent? Investigating User's Impressions of Agent's Non-Verbal Behaviours", Workshop "Investigating Social Interactions with Artificial Agents", Glasgow, November 2017.
B. Biancardi, "Towards a Computational Model for First Impressions Management", Doctoral Consortium of the 19th ACM International Conference on Multimodal Interaction, Glasgow, November 2017.
B. Biancardi, A. Cafaro, C. Pelachaud, "Analyzing First Impressions of Warmth and Competence from Observable Nonverbal Cues in Expert-Novice Interactions", 19th ACM International Conference on Multimodal Interaction, Glasgow, November 2017.
B. Biancardi, A. Cafaro, C. Pelachaud, "Investigating the Role of Gestures, Arms Rest Poses and Smiling in First Impressions of Competence", Virtual Social Interaction Workshop, Bielefeld, Germany, July 2017.
B. Biancardi, A. Cafaro, C. Pelachaud, "Gérer les premières impressions de compétence et de chaleur à travers des indices non verbaux", Quatorzièmes Rencontres des Jeunes Chercheurs en Intelligence Artificielle (RJCIA 2017), Caen, France, July 2017.
B. Biancardi, A. Cafaro, C. Pelachaud, "Investigating User's First Impressions of a Virtual Agent's Warmth and Competence Traits", Workshop Affect, Compagnon Artificiel, Interaction (WACAI), June 2016.

In any encounter the first moments are critical and the impressions that we form of others matter. These impressions tend to last and affect the interaction experience.

The goal of this project is to build an anthropomorphic virtual character (ECA - Embodied Conversational Agent) able to make the best possible first impression on a user, thus effectively engaging him or her in an interaction. This goal will be realized by building an affective loop that ties the behavior of the ECA to the emotional reactions of the user facing it in real time, giving rise to an ECA capable of managing its first impressions on users. We focus on identifying and modeling the nonverbal behavior needed to exhibit, manage and maintain impressions along two important socio-cognitive dimensions during the first minutes of interaction with a user. These dimensions are warmth (i.e. being friendly, agreeable, engaging and approachable) and competence (i.e. appearing skilled and knowledgeable on a given topic).

The IMPRESSIONS project has humanistic and computational components. The analysis and modeling of nonverbal communicative behavior is drawn from existing literature in sociology and psychology, as well as from new data gathered from controlled user studies. The computational component implements those models in a virtual agent, resulting in believable and effective social behaviors.

Furthermore, the user's behaviors and physiological signals will help the agent manage the desired impressions. The analysis of multimodal signals is valuable for assessing the user's affective states, determining the quality of the interaction and, ultimately, assessing the impressions that the user has formed of the agent. Results undergo a thorough user evaluation.

The use-case scenario for this research has immediate impact on both the research community and a wide public audience. We envision a museum as the context for our scenario. The agent will be installed in the museum's hall to engage visitors in dialogue about the museum's exhibits. The agent must therefore be seen as warm and competent in order to gain visitors' interest and positive judgments, and to enhance their experience of the museum.

The project builds on results from the PIA "Avatar 1:1 (A1:1)" project; it includes a partnership between the Swiss Center for Affective Sciences and the LTCI-CNRS, and it envisions deploying the agent in the Natural History Museum of Neuchâtel in Switzerland.

Project coordination

Catherine Pelachaud (Laboratoire Traitement et Communication de l'Information)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility for its content.

Partner

LTCI, CNRS, Télécom ParisTech Laboratoire Traitement et Communication de l'Information
UniGe Univ. de Genève, Dpt. d'informatique & Centre Suisse de Sciences Affectives

ANR grant: 136,760 euros
Beginning and duration of the scientific project: October 2015 - 36 months
