Leading technologists and practitioners take the stage to first share their speculative visions for the future, and then participate in a hands-on session exploring AI tools for the theater sector. We open with technologist Matthieu Lorrain, Creative Lead for AI & Creativity Research at Google DeepMind and co-founder of Liquid Logic, in conversation with AC Coppens.
This hands-on session dives into the practical landscape of AI tools for theater and live performance. The format begins with a fast-paced series of short pitches, each presenter demoing the tool at the center of their work. Nils Corte writes and directs interactive theater with AI-choreographed robots and holograms. Michael Rau presents LLM-based live scripting, where audience input generates real-time dialogue. Ali Nikrang shares how datasets shape AI-generated music, from Strauss-style waltzes to co-creative tools. Victorine van Alphen explores AI image generation as a poetic, introspective medium, shifting how we perceive bodies, identity, and presence. Silke Grabinger combines choreography with performative art and robotics, focusing on the anthropomorphization of technological tools. Pablo Palacio, a composer of electroacoustic and instrumental music, develops algorithmic approaches and new technologies for interactive music; his work explores the interaction between music and body movement, integrating concepts from Artificial Intelligence, biology, mathematics, and experimental psychology.
Participants will then break out into workshop clusters to explore the tools more deeply: How does the tech work? What are the creative and operational benefits? What datasets or systems fuel it? How might these tools meaningfully shape their practice? Designed to be interactive and inquiry-driven, this session invites participants to test, question, and reimagine how AI systems might support theater, both as an art form and as an operational structure.