Virtual Anatomy

Searching for an immersive environment for illustrating human anatomy, the Ars Electronica Futurelab developed a completely new form of visualization of anatomical measurement data in cooperation with researchers at Siemens Healthineers. The result is Virtual Anatomy: MRI and CT data of real patients is merged into photorealistic three-dimensional images of human anatomy. Organs, blood vessels, muscles, tendons and more can be viewed larger than life as razor-sharp three-dimensional objects from all angles; data can be faded in and out at the click of a button.

The project was so successful that in 2021 another milestone was reached: students at the Johannes Kepler University in Linz can now experience Virtual Anatomy full-time at the new JKU medSPACE. This also marked the transition from the prototyping phase to a full product, offered worldwide by the Ars Electronica Futurelab. You can find all the information on the JKU medSPACE on our project page.


How Virtual Anatomy works

With Virtual Anatomy, users can navigate anatomical data as they wish: seamlessly shifting through different layers, cutting in and out again, zooming to any area at any angle. This is accomplished by combining two programs: Cinematic Rendering by Siemens Healthineers and Virtual Anatomy by the Ars Electronica Futurelab work hand in hand. Virtual Anatomy is based on the Cinematic Rendering SDK (Software Development Kit), developed by Siemens in Princeton.

Cinematic Rendering is used to import the MRI and CT data, anonymize it and set the initial keyframes; it also generates metadata for the visualization. Virtual Anatomy further processes the data and adds important information such as the positioning, scaling and rotation of the 3D pointer, then fine-tunes and displays the data.
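The hand-off described above can be pictured as a simple data structure passed between the two stages. The following sketch is purely illustrative: the Cinematic Rendering SDK and Virtual Anatomy are proprietary, so all type and field names here are hypothetical stand-ins for the roles the text describes (anonymized volume, initial keyframes, metadata, and the pointer pose added later).

```python
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    time_s: float    # position on the presentation timeline (hypothetical)
    camera: tuple    # camera pose for this keyframe (hypothetical)

@dataclass
class VisualizationScene:
    volume_id: str                                # anonymized MRI/CT volume reference
    keyframes: list = field(default_factory=list) # set by the Cinematic Rendering stage
    pointer_position: tuple = (0.0, 0.0, 0.0)     # added by Virtual Anatomy
    pointer_scale: float = 1.0
    pointer_rotation_deg: tuple = (0.0, 0.0, 0.0)

# Stage 1 (Cinematic Rendering role): import anonymized data, set an initial keyframe.
scene = VisualizationScene(volume_id="case-0001")
scene.keyframes.append(Keyframe(time_s=0.0, camera=(0.0, 0.0, 2.0)))

# Stage 2 (Virtual Anatomy role): position, scale and rotate the 3D pointer.
scene.pointer_position = (0.1, -0.2, 0.5)
scene.pointer_scale = 1.5
```

The point of the sketch is the division of responsibilities, not the concrete fields: one stage prepares and annotates the data, the next enriches it for interactive display.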

Virtual Anatomy uses the Unreal Engine, which communicates directly with the SDK, to visualize the renderings. Additional settings in the engine enable stereo rendering optimized especially for large displays and projection walls.
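Stereo rendering for a large wall rests on a simple geometric idea: the scene is rendered twice, from two camera positions offset sideways by half the eye separation. The sketch below shows only this generic principle; the function, its parameters, and the 6.4 cm default separation are illustrative assumptions, not details of the Virtual Anatomy or Unreal Engine implementation.

```python
import math

def stereo_eye_positions(center, forward, up, eye_separation=0.064):
    """Return (left, right) eye positions offset along the camera's right axis.

    All arguments are hypothetical: `center` is the mono camera position,
    `forward`/`up` define its orientation, `eye_separation` is in meters.
    """
    # Right axis = forward x up (cross product), normalized.
    rx = forward[1] * up[2] - forward[2] * up[1]
    ry = forward[2] * up[0] - forward[0] * up[2]
    rz = forward[0] * up[1] - forward[1] * up[0]
    n = math.sqrt(rx * rx + ry * ry + rz * rz)
    rx, ry, rz = rx / n, ry / n, rz / n
    half = eye_separation / 2.0
    left = (center[0] - rx * half, center[1] - ry * half, center[2] - rz * half)
    right = (center[0] + rx * half, center[1] + ry * half, center[2] + rz * half)
    return left, right

# Camera at the origin looking down -z: eyes are shifted along the x axis.
left, right = stereo_eye_positions((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), (0.0, 1.0, 0.0))
```

On large projection walls, engines typically combine such per-eye camera offsets with asymmetric (off-axis) projection frustums so that both eyes converge on the same screen plane.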

The development process

It all began with artists and researchers looking for ways to transfer the power of animation from the world of cinema to that of medicine. The breakthrough came with image-based illumination calculation, a new method that made it possible to transform CT and MRI measurement data into vivid, photorealistic representations.

The impressive results unfolded their full potential in Ars Electronica’s Deep Space 8K in 2015. A broad public was now able to experience the three-dimensional anatomical world in a 16-by-9-meter 8K projection environment. From there, the program expanded drastically: from regular lectures for university students to a new communication platform for top physicians as well as livestreams of surgeries.

The program in all its forms was so successful that the Johannes Kepler University in Linz decided to extend it even further – by building its very own projection room. So the JKU medSPACE opened its doors: a globally unique way to experience anatomy at a university. You can read all about the JKU medSPACE on our project page.


Credits

Ars Electronica Futurelab: Roland Haring, Florian Berger, Friedrich Bachinger, Patrick Müller, Otto Naderer, Erwin Reitböck, Johannes Pöll, Kerstin Blätterbinder, Marianne Eisl, Andreas Pramböck

Medical-Scientific Director: Prim. Univ.-Prof. Dr. Franz Fellner (AKH Linz/Department of Radiology)
Partners: Siemens Healthineers; Johannes Kepler Universität (JKU); Kepler Universitätsklinikum (KUK)

Related Projects

Immerse yourself in our work

Interested in similar projects? The following Ars Electronica Futurelab projects are related to the ideas and concepts presented here. An overview of all our productions, cooperations and projects can be found in our project archive.