DS0707 - Interactions between the physical world, humans and the digital world

Creative Dynamics of Improvised Interaction – DYCI2

Submission summary

The "Dynamics of Creative Improvised Interaction" project involves the creation, adaptation and implementation of effective and efficient models of artificial listening, machine learning, interaction and online creation of musical content, enabling the design of digital musical agents. These autonomous, creative agents will be able to integrate, in an artistically credible way, into diverse human settings such as active listening, live performance, production, gaming or pedagogy. They will also contribute to the perceptual and communicative performance of embedded artificial-intelligence systems. The project puts forward improvised interaction as an anthropological and cognitive model of action and decision, as a scheme of discovery and unsupervised learning, and as a discursive tool for exchange between humans and digital artifacts, with a view to modeling style and interaction.

Improvised interaction between humans and digital agents is a recent field derived from artificial-creativity studies, based on the observation that the vast majority of human interactions are improvised. It raises several open research issues: interactive learning, whose models are built during the interaction itself and whose results in turn inflect the course of that interaction; artificial perception; and the modeling of expressive and social interaction between human and digital agents, in its anthropological, social, linguistic and informational dimensions. Improvised interaction engages the perception/action loop and recasts learning in a design where the agent learns from the reactions of other agents to its own productions. The resulting learning model, which incorporates the process of improvisation, is both generative and reflexive.

Integrating artificial listening, temporal modeling of musical behavior and dynamic creative-interaction structures into an effective architecture for real-world experiments is an ambitious challenge for the information and communication society: many rich potential applications could change the game in the relationship between humans and creative artificial agents in the cultural industries, including music production and post-production, video games, live performance, innovative audio formats, and new narrative schemes.

The project articulates these three major research issues in an experimental software environment, taking maximum advantage of the partners' expertise and of their interactions. The three themes, each of which involves at least two collaborating partners, correspond to the theoretical abilities of a creative agent: informed listening, meant to analyze the audio scene and extrapolate musical structures by exploiting observed similarities and any available a priori knowledge; adaptive learning of musical structures, meant to combine formal-sequence modeling with probabilistic approaches in order to handle the complexity of musical discourse and cope with necessarily limited data; and dynamics of improvised interaction, meant to involve multi-agent models of knowledge and decision and to put into practice innovative scenarios of co-improvisation between human and digital actors.
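To make the second theme concrete, here is a minimal illustrative sketch of learning a musical sequence online and generating a stylistically consistent continuation. It uses a toy first-order Markov model over note names; this is an assumption chosen for brevity, not the project's actual models, which rely on richer formal-sequence structures.

```python
import random

def train_markov(sequence):
    """Build a first-order Markov model: map each symbol to its observed successors."""
    model = {}
    for a, b in zip(sequence, sequence[1:]):
        model.setdefault(a, []).append(b)
    return model

def continue_sequence(model, seed, length, rng=None):
    """Walk the model from `seed`; on a dead end, restart from any learned symbol."""
    rng = rng or random.Random(0)
    out = [seed]
    for _ in range(length):
        choices = model.get(out[-1]) or list(model)
        out.append(rng.choice(choices))
    return out

# A toy melodic phrase "heard" during the interaction, then continued in its style.
phrase = ["C", "D", "E", "C", "D", "G", "E", "C"]
model = train_markov(phrase)
result = continue_sequence(model, "C", 8)
print(result)
```

Because successors are sampled from observed transitions only, the continuation stays within the learned vocabulary while remaining non-deterministic, a (very simplified) analogue of learning whose results inflect the ongoing interaction.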

Project coordinator

Monsieur Gérard ASSAYAG (INST RECH COORD ACOUSTIQ MUSIQ)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility as to its contents.

Partner

UNIVERSITE DE LA ROCHELLE
UBO LABORATOIRE DES SCIENCES ET TECHNIQUES DE L'INFORMATION, DE LA COMMUNICATION ET DE LA CONNAISSANCE
Inria Institut de Recherche en Informatique et en Automatique
IRCAM INST RECH COORD ACOUSTIQ MUSIQ

ANR grant: 500,000 euros
Start and duration of the scientific project: September 2014, 36 months
