CE33 - Interaction, robotique

Mixed Musical Reality with Creative Instruments – MERCI

Improvisation is a primary driver of human interaction in all aspects of communication and action. This project proposes to radically renew the paradigm of improvised human-computer interaction by establishing a continuum from co-creative musical logic to a form of “physical interreality” (a mixed reality scheme where the physical world is actively modified) anchored in acoustic instruments, the “creative instruments”.

Improvisation can be seen as a major driving force in human interactions, strategic in every aspect of communication and action. In its highest form, improvisation is a mixture of structured, planned, directed action and of hardly predictable local decisions and deviations that optimize adaptation to the context, express the creative self in a unique way, and stimulate coordination and cooperation between agents. An invaluable observation deck for understanding, modeling and promoting co-creativity in a context of distributed intelligence, improvisation is an indispensable ability that any cyber-human system should handle expertly. It is instantiated in its most refined form in music, where the strongest constraints govern the elaboration of highly complex multi-dimensional, multi-scale, multi-agent actions in a cooperative and timely fashion, so as to achieve creative social and cultural cooperation.

Setting up powerful and realistic human-machine environments for improvisation requires going beyond the mere software engineering of creative agents with audio-signal listening and generating capabilities, as has mostly been done until now. The partners (Ircam STMS Lab, EHESS CAMS Lab, and the HyVibe startup company) propose to drastically renew the paradigm of human-machine improvised interaction by bridging the gap between the computing logics of co-creative musical agents and mixed reality setups anchored in the physics of acoustic instruments. In such setups of “physical interreality” (a mixed reality scheme where the physical world is actively modified by human action), human subjects will be immersed and engaged in tangible actions, achieving full embodiment in the digital, the physical and the social world, thanks to a joint effort gathering experts from a broad interdisciplinary spectrum.

The main objective of this project is to create the scientific and technological conditions for mixed reality musical systems enabling human-machine improvised interactions, based on the interrelation of creative digital agents and active acoustic control in musical instruments. We call such mixed reality devices Creative Instruments. Functionally integrating creative artificial intelligence and active control of acoustics into the organological heart of the musical instrument, in order to foster plausible physical interreality situations, requires the synergy of highly interdisciplinary public and private research, as brought together by the partners. Such progress is likely to disrupt artistic and social practices, eventually impacting the music industry as well as amateur and professional music practices in a powerful way.

Considering the research context and preliminary results, we wish to optimize research productivity and risk containment by operating simultaneously in two symmetrical and converging directions, which will eventually join in the inception of the Creative Instrument:
(a) augment the cyber entities' abilities with enhanced computational creativity, autonomy and multimodal sensitivity to context, as well as with cyber-physical actuation inside the instrument, so they can enter into a convincing expert interplay with humans operating the musical instrument;
(b) augment the human abilities by interfacing them into mixed reality by means of embodied instrumental interfaces that will augment their musical awareness and efficiency, so they can interact in a natural and creative way with artificial agents.
The articulation of these two approaches, through a series of research/development cycles fed by more fundamental research tasks providing the necessary modelling tools and social science assessment and data, will lay the ground for a research ecosystem aimed at realizing the first prototypes of the Creative Instrument. In order to coordinate this multi-disciplinary approach in time, we will launch two clusters of research work packages that will interact throughout the duration of the project:
(1) machine learning research providing trained models, and social science field research providing human data and heuristic guides for model parametrization;
(2) development of experimental platforms for mixed reality and computational creativity software, joining together the two augmentations (a) and (b) in Creative Instrument prototypes.
These two clusters will be complemented and articulated by (3) an Experimentation, Usage and Outreach package, which drives the timing of a series of iterations between modelling, prototyping and experimentation tasks. Going through fast research and development cycles that favor interaction between all work packages, with milestone deliverables produced at the end of each yearly cycle, will smooth out interdisciplinary cooperation and prevent unforeseen problems from seizing up the organization of the project. The Work Plan subsection below details this program's precise package and sub-task breakdown.
Inside this program, we will jointly use methods drawn from (1) interactive computational creativity, (2) active acoustic control of music instruments, (3) machine learning and data science of musical information, and (4) social sciences and anthropology of improvised practices.

One of the practical outputs of the project will be the realization of the first concrete prototype of the Creative Instrument based on the HyVibe Guitar, by expanding it with mixed reality and interreality features and by equipping it with creative digital agents, as a prelude to extension to other instrument families. Novel applications for interacting with human musicians will be implemented in the instrument and evaluated, such as: creative backing tracks, adding versatile accompaniment in constrained styles; creative co-improvisation, where the instrument freely creates autonomous content or doubles the musician's playing with ever-changing and coordinated musical lines; creative looper, with record/loop processes that evolve by themselves into new generative content; and creative orchestration, blending in sophisticated arrangements and orchestrations. These are only a few examples of how the Creative Instrument could bring about the long-awaited convergence between physicality, information processing and creativity, in a manner that will be maneuverable and enjoyable by all, at all levels of skill and expertise.
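To make the "creative looper" idea concrete, here is a minimal illustrative sketch (not the project's actual algorithms, whose models are not specified here): a recorded phrase of MIDI pitches is recombined by a first-order transition graph, so each replay locally resembles the loop while diverging globally into new material.

```python
# Illustrative sketch only: a recorded loop "evolves" into new generative
# content by random walks over its own pitch-transition graph.
import random

def build_transitions(phrase):
    """Map each pitch to the pitches that followed it in the recording."""
    transitions = {}
    for a, b in zip(phrase, phrase[1:]):
        transitions.setdefault(a, []).append(b)
    return transitions

def generate_variation(phrase, length, seed=None):
    """Walk the transition graph to produce a line that is locally
    consistent with the recorded loop but globally new."""
    rng = random.Random(seed)
    transitions = build_transitions(phrase)
    note = rng.choice(phrase)
    out = [note]
    for _ in range(length - 1):
        # Restart anywhere in the phrase if the walk hits a dead end.
        note = rng.choice(transitions.get(note, phrase))
        out.append(note)
    return out

# Record a loop (MIDI pitch numbers), then generate an evolving variation:
loop = [60, 62, 64, 62, 60, 67, 65, 64]
print(generate_variation(loop, 8, seed=1))
```

A real creative looper would of course operate on richer representations (timing, dynamics, audio), but the sketch shows the core loop-to-generation step.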
In addition to instrument prototypes, the project will provide theoretical foundations, technologies, software suites and experimental data made available in open access to the research community as well as to other communities (e.g. augmented instrument design). These products will be the result of confronting several scientific challenges, including: creation of a model of knowledge and decision making for synchronous and asynchronous interaction strategies of agents involved in individual and collective improvised interactions; augmentation of human abilities through mixed reality instrumental setups with embedded creative artificial intelligence; and observation of improvised behaviours with collection and sharing of human data. In the course of this project, we aim to better understand the processes of cyber-human co-creativity, enrich the users' perceptual, emotional and social experience, and energize their individual or collective experience of improvisation.

Beyond known cyber-physical systems that create a continuity between the digital and the physical world, Creative Instruments constitute a cyber-human experience, bridging human and computational creativity through the mediation of a combination of material tools, such as augmented acoustic instruments, and of digital ones, such as generative learning and artificial intelligence.
The research ecosystem to be put in place requires us to collaboratively answer a series of research questions:
• How can we augment the expertise of digital agents with enhanced computational creativity and connectivity, so they can enter into a more convincing interplay and co-creativity with humans?
• How can we augment the capacities of digital agents with physical extensions into acoustic musical instruments, so they can foster embodied interactions with humans?
• How can we augment human possibilities by interfacing humans into mixed reality set-ups anchored in musical instruments with creative agents, so as to boost their individual and relational creative potential?

These questions are essential for computers to be able to process in real time the complex multi-variate time signals exchanged by humans and cyber-entities. These signals — especially in the case of music — carry implicit multidimensional and multi-scale semantic structures that are difficult to delimit correctly and efficiently, but are nevertheless paramount to the proper functioning of artificial perception and music generativity. Another difficulty lies in the gathering of these functionalities into a single object, the musical instrument, a device emblematic of the highly creative and aesthetic relationship between human sensory-motor system, memory, imagination and intentionality.
We will pay great attention to the careful design of the system's musical functionalities and of the perception/action loop it supports, so as to regulate the appropriate feedback processes between the human performer, the physical mechatronics and the creative software agents, triggering in return productive cross-learning mechanisms. In order to fully understand these issues in the context of human practices, we will also pursue the study of the power of such instruments as driving forces in the process of knowledge acquisition and music creation, considering the social traces left in the process of their appropriation and sharing, and the affordances that shape their usage. This dimension of the project, related to instrumentality, embodiment and interfaces, will require expertise from the social science domain as well.
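The perception/action loop described above can be caricatured as a listen → update → respond cycle. The skeleton below is a hypothetical sketch (all names invented; the trivial octave-echo policy stands in for a learned generative model) showing where artificial listening, decision making and actuation back into the instrument would plug in.

```python
class PerceptionActionLoop:
    """Toy perception/action loop: each incoming event is perceived,
    stored in memory, and answered by an action that would feed back
    into the instrument's actuators."""

    def __init__(self):
        self.memory = []

    def perceive(self, event):
        # Artificial listening: accumulate the performer's events.
        self.memory.append(event)

    def decide(self):
        # Placeholder policy: echo the last heard pitch an octave up.
        # A real agent would query a learned generative model here.
        return self.memory[-1] + 12

    def run(self, events):
        responses = []
        for event in events:
            self.perceive(event)             # listen
            responses.append(self.decide())  # generate / actuate
        return responses

agent = PerceptionActionLoop()
print(agent.run([60, 64, 67]))  # prints [72, 76, 79]
```

In the actual Creative Instrument, perception, decision and actuation would run concurrently in real time over audio and gesture streams; the sketch only fixes the shape of the feedback cycle.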

The project will produce deliverables (scientific reports, software and hardware prototypes, international workshops), PhD theses, Master's theses, journal papers, conference papers, books and book chapters, general-audience dissemination (press, radio, multimedia, web, social networks), artistic creations (concerts, music pieces), and talks, keynotes and presentations at academic and industrial events.

Improvisation can be seen as a major driving force in human interactions, strategic in every aspect of communication and action. In its highest form, musical improvisation is a mixture of structured, planned, directed action and of hardly predictable local decisions and deviations that optimize adaptation to the context, express the creative self in a unique way, and stimulate coordination and cooperation between agents. Setting up powerful and realistic human-machine environments for improvisation requires going beyond software engineering of creative agents with audio-signal listening and generating capabilities. This project proposes to drastically renew the paradigm of human-machine improvised interaction by establishing a continuum from the logics of co-creative improvising agents to a form of “physical interreality” (a mixed reality scheme where the physical world is actively modified) embedded in acoustical instruments, involving full embodiment for musicians.

Project coordination

Gérard ASSAYAG (INST RECH COORD ACOUSTIQ MUSIQ)

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility for its contents.

Partners

IRCAM INST RECH COORD ACOUSTIQ MUSIQ
Hyvibe HYVIBE
UCSD / CREL
CAMS Centre d'analyses et de mathématiques sociales

ANR funding: 689,030 euros
Beginning and duration of the scientific project: January 2020 - 36 Months
