How to dive into hybrid reality


In our new video, the team of the Ars Electronica Futurelab shares its insights into its new video production system, Deep Virtual. Here, every tiny movement of a film camera is precisely tracked and used to merge the real and virtual worlds: Deep Virtual enables protagonists on the virtual stage of Deep Space 8K to interact with their immersive backdrop in real time instead of in post-production.
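To illustrate the basic principle behind such camera-tracked compositing, here is a minimal conceptual sketch in Python. It is not the actual Deep Virtual software; all names (CameraPose, read_tracked_pose, render_virtual_backdrop, composite_frame) are hypothetical placeholders. It simply shows the core idea: the virtual camera copies the tracked pose of the physical camera every frame, so the live footage and the virtual backdrop stay aligned in real time rather than being matched later in post-production.

```python
# Conceptual sketch only: all functions below are hypothetical placeholders,
# not the real Deep Virtual API.
from dataclasses import dataclass
import time


@dataclass
class CameraPose:
    """Position (metres) and orientation (degrees) of the physical film camera."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float


def read_tracked_pose(t: float) -> CameraPose:
    """Placeholder: a real setup would poll the camera tracking system here."""
    # Simulate a slow dolly-in so the pose changes over time.
    return CameraPose(x=0.0, y=1.6, z=3.0 - 0.1 * t, pan=0.0, tilt=-2.0, roll=0.0)


def render_virtual_backdrop(pose: CameraPose) -> str:
    """Placeholder: render the virtual scene from the same viewpoint as the film camera."""
    return f"backdrop@({pose.x:.2f},{pose.y:.2f},{pose.z:.2f})"


def composite_frame(live_frame: str, backdrop: str) -> str:
    """Placeholder: merge the live footage with the virtual backdrop for this frame."""
    return f"{live_frame}+{backdrop}"


def run(frames: int = 3, fps: float = 30.0) -> None:
    """Per-frame loop: track the camera, render the matching backdrop, composite live."""
    for i in range(frames):
        t = i / fps
        pose = read_tracked_pose(t)            # 1. read the physical camera's pose
        backdrop = render_virtual_backdrop(pose)  # 2. move the virtual camera to match
        output = composite_frame(f"live_frame_{i}", backdrop)  # 3. merge real + virtual
        print(output)
        time.sleep(1.0 / fps)


if __name__ == "__main__":
    run()
```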

In this way, Deep Virtual makes it possible to broadcast mixed realities live, giving viewers of the two-dimensional videos a completely new impression of spatial depth. The complex system was developed to remotely immerse the audience in the virtual worlds of Ars Electronica without the need for 3D glasses. The audience can participate equally, whether on site, via online streaming or through telepresence robots.

Deep Virtual was developed, tested and perfected over the course of producing eight episodes of the Anniversary Series for the Futurelab’s 25th anniversary. In our new video, you can find out in six lessons how we use interactive elements, artificial intelligence, protagonists, lighting and more in the Ars Electronica Center’s Deep Space 8K. Want to know more? Learn all about Deep Virtual in our first video!
