With Deep Virtual, the Ars Electronica Futurelab has developed the prototype of a unique new virtual video production system that precisely tracks the position and movement of a film camera and places virtual content into the scene in real time. In this immersive environment, the protagonists can interact with virtual content that seamlessly merges with real people and objects. The resulting 2D videos convey a completely new sense of spatial depth without the need for 3D glasses.
Virtual production systems originated in Hollywood film studios, where they are used to enhance TV or film productions with additional layers of information and visual content; there, the virtual and real worlds are usually combined in post-production. Deep Virtual performs this merger in real time. In Deep Space 8K, the 3D projection space at the Ars Electronica Center, eight cameras track every movement for this purpose. The resulting images are combined with the virtual world, with matching perspectives, via a virtual production pipeline in a real-time sandbox. This creates a sense of spatial depth comparable to the impression produced by stereoscopic images. Deep Virtual was conceived to let viewers immerse themselves in the virtual worlds of Ars Electronica, and it acts as a platform for hybrid live presentations: the audience can participate equally on-site in Deep Space 8K and online via stream or telepresence robot.
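The core idea behind perspective-matched compositing can be illustrated with a minimal sketch. This is not the Futurelab's actual pipeline (which is not published here); it only shows, under simple pinhole-camera assumptions, how a tracked physical camera pose can be turned into a view matrix so that virtual content is rendered from the same viewpoint and overlaid on the live image. The function names (`look_at`, `project`) and the focal-length parameter are illustrative.

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a view matrix from a tracked camera pose (position + look target)."""
    f = target - eye
    f = f / np.linalg.norm(f)            # forward axis
    s = np.cross(f, up)
    s = s / np.linalg.norm(s)            # right axis
    u = np.cross(s, f)                   # true up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye    # translate world into camera frame
    return view

def project(point_world, view, focal, width, height):
    """Project a 3D point of the virtual scene to pixel coordinates,
    matching the perspective of the tracked physical camera."""
    p_cam = view @ np.append(point_world, 1.0)
    x, y, z = p_cam[:3]
    if z >= 0:                           # camera looks down -z; cull points behind it
        return None
    u = width / 2 + focal * x / -z
    v = height / 2 - focal * y / -z
    return (u, v)

# Hypothetical tracked pose: camera 5 m in front of the stage origin.
view = look_at(np.array([0.0, 0.0, 5.0]), np.zeros(3))
print(project(np.array([0.0, 0.0, 0.0]), view, 1000.0, 1920, 1080))  # stage center → image center
```

In a real-time system, the tracking cameras would update `eye` and `target` every frame, and the render engine would draw the whole virtual scene with this view matrix before compositing it with the live video.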
Deep Virtual was developed, tested and refined during the production of eight episodes of the series created for the Futurelab's 25th anniversary. After a major upgrade of Deep Space 8K, the lab is now focused on developing different participatory formats for hybrid audiences. Meanwhile, the transformation of the static virtual environment into a more interactive and reactive backdrop is already serving as an inexhaustible source of inspiration for the lab. Learn more about our innovative concepts for hybrid environments in Virtual Worlds!