Video featuring the ImproteK project
The Music Representations team in the STMS laboratory at IRCAM (UMR 9912 Sciences et technologies de la musique et du son, IRCAM, CNRS, Sorbonne University), under the direction of Gérard Assayag, is working on "cyber-human" systems. Just as cyber-physical systems create a continuity between the digital logic of computers and the physical world by capturing, modeling, and modifying it, cyber-human systems establish a creative continuity between machines and musicians through mathematical modeling and artificial intelligence.
Artificial, Creative, and Autonomous Agents
The REACH project (PI: G. Assayag) puts the spotlight on a new concept of co-creativity in cyber-human systems, an emergent phenomenon arising in a framework of symbolic interaction between humans and artificial devices. This phenomenological approach neutralizes the metaphysical content of terms such as creativity, intentionality, aesthetics, and emotion by creating objective conditions for the appearance of complex, distributed, and adaptive behaviors in a two-way learning interaction shaped by the combined responses of humans and machines. The resulting effects therefore cannot be reduced to the isolated output of any one of the agents involved, a property characteristic of complex systems.
This innovative approach has led to the invention of several generations of technologies based on artificial listening, machine learning, and creative modeling, such as the improvisation software OMax developed by the team, along with its descendants. The work continues under the direction of Gérard Assayag in the projects DYCI2 (Dynamiques créatives de l'interaction improvisée, "creative dynamics of improvised interaction") and MERCI (Réalité Musicale Mixte avec Instruments Créatifs, "mixed musical reality with creative instruments"), both supported by the French National Research Agency (Agence Nationale de la Recherche), as well as in the ERC Advanced Grant project REACH (Raising co-crEAtivity in Cyber-Human musicianship) supported by the European Research Council.
The examples selected for this video were produced with ImproteK (M. Chemillier, J. Nika), a variant of OMax that adds the idea of a scenario to the machine's improvisation.
The system used in Three Ladies (Tre donne in Italian), designed by Georges Bloch with the pianist Hervé Sellin, makes it possible to combine different types of music through a scenario, in this case the chord chart of the jazz standard "The Man I Love". The different forms of music used (excerpts sung by Piaf, Schwarzkopf, and Billie Holiday) are "learned" by the machine and constitute the "knowledge" of the process, which then "improvises" by recombining this material in real time so as to follow the scenario. The virtual vocal presence of these great female singers mingles with Hervé Sellin's piano in an original, ever-new interpretation of "The Man I Love".
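The scenario mechanism described above can be caricatured in a few lines of Python. This is only an illustrative sketch under simplified assumptions, not ImproteK's actual algorithm (which relies on richer sequence models and real-time audio handling): memory is reduced to a list of phrases labelled by the chord they were recorded over, and the generator walks the chord chart, preferring to continue the phrase just played and jumping elsewhere in memory when continuity breaks.

```python
import random

def improvise(memory, scenario, seed=0):
    """Scenario-guided recombination: for each label in the scenario
    (e.g. a chord chart), reuse a matching event from memory,
    preferring to continue contiguously from the event just played."""
    rng = random.Random(seed)
    output, last = [], None
    for label in scenario:
        # Prefer continuity: the event immediately after the one just played.
        if last is not None and last + 1 < len(memory) and memory[last + 1][0] == label:
            nxt = last + 1
        else:
            # Otherwise jump to any memory event carrying the required label.
            candidates = [i for i, (lab, _) in enumerate(memory) if lab == label]
            if not candidates:
                output.append(None)  # no memory for this chord: rest
                last = None
                continue
            nxt = rng.choice(candidates)
        output.append(memory[nxt][1])
        last = nxt
    return output

# Toy memory: sung phrases labelled by the chord they were recorded over
# (labels and phrase names are hypothetical, for illustration only).
memory = [("Gm7", "piaf-1"), ("C7", "piaf-2"), ("F", "holiday-1"),
          ("Gm7", "schwarzkopf-1"), ("C7", "holiday-2"), ("F", "piaf-3")]
scenario = ["Gm7", "C7", "F", "F"]  # a fragment of a chord chart
print(improvise(memory, scenario, seed=1))
```

Every phrase in the output is guaranteed to match the chord demanded by the scenario at that point, while the continuity preference keeps recombined fragments musically coherent; this is the intuition behind "following the scenario" with learned material.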
The memory built up by the creative agents can also be audiovisual, adding to the sound the fantasy images of a performance that will never take place, as seen in the Tre donne video example.
In addition, memory can be formed at the very moment of interaction, by listening artificially to the musician: this activates an instantaneous "double" of the performer and allows this avatar to become autonomous and enter into an entirely new creative dialogue. This is illustrated by Rémi Fox and Jérôme Nika's creation based on Booker T's piece Rent Party.
This same technology is used, in a modern version, in the libDYCI2 environment (J. Nika) for Georges Bloch's concert Three Ladies Project, presented online at Ars Electronica on September 5th. In this concert, a historic performance by the great singer Dianne Reeves in Montreux is reconstructed in real time by the software, based on the harmony of the piano played live by Hervé Sellin, attaining the objective of human-machine co-creativity first set out by the OMax project.