JCJC - SIMI 2 - Science informatique et applications

Extended Framework For In-time Computer-Aided Composition – EFFICACe

New paradigms for computer-aided music composition

The aim of the EFFICACe project is to develop computer music tools that integrate a renewed conception of time, signal and interaction in compositional processes. The project builds upon an exploration of the relations between musical time and computation in computer-aided composition, based on OpenMusic and other technologies developed at IRCAM and at CNMAT.

Time, computation and interaction

We consider computer-aided composition outside its traditional "offline" paradigm, and try to integrate compositional processes in structured interactions with their external context. These interactions can take place during executions or performances, or at the early compositional stages (in the processes that lead to the creation of the musical material).

We address our general objective from a number of complementary perspectives, such as:
- The development of reactive systems for computer-aided composition;
- The study of compositional practices and interactions;
- The study and enhancement of time structures for scheduling and computation in music;
- The control, visualisation and execution of spatialization and granular sound synthesis processes;
- Rhythm and symbolic time structures;
- Integration of gestural interaction and external devices in compositional processes.

These different points will be addressed while fostering scientific and artistic exchanges and collaborations. We wish to develop international partnerships in order to maximize both the available expertise and the future distribution of our tools.

Our preliminary results include a working reactive extension of the OpenMusic environment, available as an external library and to be integrated in the forthcoming version 6.9.
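To give a rough idea of the principle behind this reactive extension, the sketch below (a hypothetical Python illustration, not OpenMusic's actual Lisp API; all class and function names are ours) shows how boxes of a dataflow patch can be marked as reactive so that data received from the environment is pushed downstream and dependent results are recomputed automatically, while ordinary evaluation remains demand-driven.

```python
# Minimal sketch (hypothetical, illustration only -- not the actual OpenMusic API)
# of the reactive principle: boxes marked as reactive propagate changes
# downstream, while ordinary evaluation remains demand-driven ("offline").

class Box:
    def __init__(self, fn, *inputs, reactive=False):
        self.fn = fn                # computation performed by the box
        self.inputs = list(inputs)  # upstream boxes or constant values
        self.reactive = reactive    # if True, upstream changes push downstream
        self.listeners = []         # downstream boxes to notify
        self.value = None
        for src in inputs:
            if isinstance(src, Box):
                src.listeners.append(self)

    def evaluate(self):
        # traditional demand-driven evaluation (the user requests a result)
        args = [src.evaluate() if isinstance(src, Box) else src
                for src in self.inputs]
        self.value = self.fn(*args)
        return self.value

    def notify(self):
        # reactive update: recompute and propagate further downstream
        if self.reactive:
            self.evaluate()
            for box in self.listeners:
                box.notify()


class Receiver(Box):
    """Entry point for external events (e.g. incoming OSC or MIDI data)."""
    def __init__(self):
        super().__init__(lambda: None, reactive=True)

    def receive(self, value):
        self.value = value
        for box in self.listeners:
            box.notify()

    def evaluate(self):
        return self.value


# Example: a small "patch" transposing incoming pitches by a fifth
pitch_in = Receiver()
transposed = Box(lambda p: p + 7, pitch_in, reactive=True)

pitch_in.receive(60)      # an external event arrives...
print(transposed.value)   # ...67 is recomputed without an explicit request
```

In the actual environment this combination of demand-driven and push-based evaluation applies to the boxes of OpenMusic visual programs; the sketch only conveys the propagation logic.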

Presentations at scientific and artistic events, e.g.:
- Quid sit musicus by Ph. Leroux (Festival ManiFeste 2014, IRCAM)
- French-Brazilian Colloquium on Computer Aided Music Creation and Analysis
- Vienna Summer of Logic
- Agorantic seminar "Culture, Patrimoines, Sociétés Numériques"
- Live concert "Livre Digital - Fronteiras Musicais", Campinas, Brazil, 08/2014.

Organization of scientific events and workshops (see repmus.ircam.fr/efficace/events)

Functional prototypes integrating computer-aided composition tools, user interfaces, time structures and sound processing/spatialization technology will be released on our website.
Follow-up of artistic and scientific collaborations for the design and dissemination of our work.

- J. Garcia, J. Bresson, T. Carpentier: Towards Interactive Authoring Tools for Composing Spatialization, 3DUI'15: IEEE 10th Symposium on 3D User Interfaces, Arles, France, 2015.
- J. Bresson, J.-L. Giavitto: A reactive extension of the OpenMusic visual programming language, Journal of Visual Languages and Computing, 25(4), 2014.
- J. Bresson: Reactive Visual Programs for Computer-Aided Music Composition. IEEE Symposium on Visual Languages and Human-Centric Computing – VL/HCC, Melbourne, Australia, 2014.
- A. Vinjar, J. Bresson: OpenMusic on Linux. Linux Audio Conference, Karlsruhe, Germany, 2014.
- D. Bouche, J. Bresson, S. Letz: Programming and Control of Faust Sound Processing in OpenMusic. Proc. International Computer Music Conference, Athens, Greece, 2014.
- J. Garcia, P. Leroux, J. Bresson: pOM - Linking Pen Gestures to Computer-Aided Composition Processes. Proc. International Computer Music Conference, Athens, Greece, 2014.

The objective of this project is to develop the concepts and to anticipate the tools needed for the emergence of the next generation of music composition systems.

Computer-aided composition research relies on computer languages and formalisms to provide composers with expressive means to model or solve musical situations and to generate musical structures (scores, harmonies, sounds, etc.). Beyond the musical context, this research field spans numerous areas of computer science: programming, mathematical modelling, human-computer interfaces and others. In particular, it is a very active application domain for end-user programming and visual programming languages.

The ambition of this project is to create a new generation of computer-aided composition systems adapted to contemporary musical practices and based on renewed concepts in computer science and music technology. Our purpose is to reduce and untangle critical antagonisms in the domain of computer music, in order to bridge and combine high-level composition systems with other computer music disciplines and frameworks such as signal processing, sound synthesis and spatialization. This connection between signal and symbolic approaches to music processing and representation is considered along with the duality between offline and real-time systems. These issues are tackled from the standpoint of the representation and integration of time in compositional structures: time in data representation and processing, gathering macro- and micro-levels of musical structures, and time in the computational context, mixing the offline computation of timed structures with the integration of real-time interactions. We propose to extend the state of the art in our research domain with new computational and programming paradigms, merging existing formal and declarative approaches with the reactive approach inspired by real-time interactive systems, and we aim to create a computer environment capable of handling these different dimensions of musical structures and processes in a common, consistent and expressive framework.
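As a purely illustrative example of what gathering macro- and micro-level time structures can mean in practice, the following minimal sketch (the tempo, sample rate and event format are assumptions, not project code) maps symbolic onsets expressed in beats to absolute time, and then to sample positions usable by signal-level processes.

```python
# Hypothetical sketch: relating symbolic (macro-level) time to signal
# (micro-level) time -- beats -> seconds via a tempo, seconds -> samples.

SR = 44100.0     # audio sample rate in Hz (assumed)
TEMPO = 90.0     # quarter notes per minute (assumed constant)

def beats_to_seconds(beats, tempo=TEMPO):
    return beats * 60.0 / tempo

def seconds_to_samples(seconds, sr=SR):
    return round(seconds * sr)

# A symbolic event list: (onset in beats, MIDI pitch, duration in beats)
events = [(0.0, 60, 1.0), (1.0, 64, 1.0), (2.0, 67, 2.0)]

# The same events rendered at signal resolution, ready for sample-accurate
# scheduling by a synthesis or spatialization process
schedule = [(seconds_to_samples(beats_to_seconds(onset)),
             pitch,
             seconds_to_samples(beats_to_seconds(dur)))
            for onset, pitch, dur in events]

for start, pitch, length in schedule:
    print(f"pitch {pitch}: starts at sample {start}, lasts {length} samples")
# pitch 60: starts at sample 0, lasts 29400 samples
# pitch 64: starts at sample 29400, lasts 29400 samples
# pitch 67: starts at sample 58800, lasts 58800 samples
```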

The proposed research challenges current approaches and categories in the field, in particular regarding the joint representation, programming and execution of functional and temporal structures. It also includes substantial work on visual programming and musical interfaces, through the development of an operational environment for composers, computer musicians and researchers.

Our team is composed of researchers and engineers with extensive experience in the different scientific domains involved in this project, who are active in the contemporary music creation scene as composers or scientists.

Project coordination

Jean Bresson (Institut de Recherche et Coordination Acoustique/Musique - UMR Sciences et Technologies de la Musique et du Son)

The author of this summary is the project coordinator, who is responsible for the content of this summary. The ANR declines any responsibility as to its contents.

Partner

IRCAM - Institut de Recherche et Coordination Acoustique/Musique - UMR Sciences et Technologies de la Musique et du Son

ANR grant: 324,167 euros
Beginning and duration of the scientific project: September 2013 - 42 months
