Huge amounts of data. A distance of 9,223 kilometers. No transmission losses. And all of this via a public connection – the task that the Ars Electronica Futurelab and NHK have taken on is anything but easy. Still: As part of the EU research project Immersify, the world’s first transcontinental 8K Live Stream via a public Internet connection will take place on August 28, 2019 at 09:00 AM. While dancers in Tokyo perform the traditional Sanbasō, visitors in Linz can watch it live on the 9x16m screen of Deep Space 8K.
For this interview, we sat down with Roland Haring and Ali Nikrang from the Ars Electronica Futurelab to learn more about the 8K Live Stream, Immersify and the challenges of such a project.
(And for all those who will only be in town during the 2019 Ars Electronica Festival: On Friday, September 6, 2019, at 10:00 AM, a recap and two further 8K Live Streams will be shown at Deep Space 8K)
Roland, you’re preparing something very special right now. What will happen on 28 August 2019 at Deep Space 8K in Linz?
Roland Haring: We are very proud to be celebrating a world premiere this year in the run-up to the Ars Electronica Festival: On August 28, 2019, the first intercontinental 8K Live Stream will take place at Deep Space 8K, broadcast over the public Internet. It’s a premiere in many ways – live streams of such high quality have never been shown before. It’s a big challenge to record and stream such footage, transmit it over a public network and then display the received data.
This was made possible by the EU project Immersify, where the right partners got together and brainstormed what could be done. In addition to our European partners, we were also able to bring NHK from Japan on board, a world leader in broadcasting. Especially when it comes to 8K, they are pioneers – these data volumes bring enormous complexity, especially in live operation. In Japan, a limited 8K broadcasting service is already on the air!
What will you be showing during the Live Stream on August 28th?
Roland Haring: On this date, the Japanese dance performance Sanbasō will be presented at the Tokyo National Theater. It’s a very traditional performance and touches deeply on the roots of Japan’s culture – and that’s exactly what we’re going to livestream to Linz.
What challenges do you face in this big streaming project?
Roland Haring: On the one hand, you of course have to record the events and then encode them into a data stream. The next challenge is to send this data halfway around the world without any losses or picture interruptions. Usually such a project would be realized via dedicated network links, where TV stations rent bandwidth from Internet providers and use it exclusively for a certain period of time. But we do it differently: We do not reserve our own bandwidth, but send our data via the public infrastructure.
According to the motto: Why take the easy route if there’s a complicated one as well…
Roland Haring: It’s a proof of concept. Immersify plays a pioneering role; it is intended to realize a wide variety of scenarios in an exemplary manner in order to show what high-quality, high-resolution and immersive media experiences can already be realized with the existing technical infrastructure. Of course, streaming also plays a role here. The fact that we use the public Internet for this makes the technology tangible not only for media producers and other interested parties, but also increases its general availability.
Ali Nikrang: What is more, when we tested it we had one receiver in Linz, one in Berlin and one in Poland. So the signal coming from Japan was available at three different locations in Europe. And another thing that is often forgotten: this is all the more impressive when you consider that 8K is not simply two times 4K – in reality it is four times as much as 4K! That is the volume we are talking about here. It’s very challenging to transmit such a signal live over so many kilometers. By contrast, when you download a file from the Internet, it doesn’t matter if the download sometimes runs faster or slower. When we broadcast something live, it has to be guaranteed that everything arrives steadily and on time. Using the public Internet for this is, of course, risky in that respect.
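To put Ali’s point in numbers – a quick illustrative calculation, not part of the interview – 8K has four times the pixel count of 4K UHD, and sixteen times that of Full HD:

```python
# Illustrative pixel arithmetic: "8K is not two times 4K, it is four times 4K."
resolutions = {
    "Full HD": (1920, 1080),
    "4K UHD":  (3840, 2160),
    "8K UHD":  (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["8K UHD"] // pixels["4K UHD"])   # -> 4  (four times the pixels of 4K)
print(pixels["8K UHD"] // pixels["Full HD"])  # -> 16 (sixteen times Full HD)
```

At around 33 million pixels per frame, it is easy to see why an uncompressed or even compressed 8K live signal strains a public connection.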
You need a special receiver, the infrastructure is not yet fully developed – is 8K still the future of television?
Roland Haring: I think so. Usually, such technical developments follow the same pattern. First, video codecs are specified, then come the first tests and software implementations, and finally optimizations and high-performance implementations. This is not yet very interesting for end users, because you still need very powerful hardware. The next step is where it gets exciting: when codecs are implemented as chips in hardware. These chips can be found in every mobile phone, in computers and on graphics cards – they are very common and no special devices are needed. What we can see is that this is exactly the kind of development the industry is currently working on. It may take a while, but 8K will be as normal as HD is now.
On the other hand, playing content in 8K is no problem at Deep Space even now.
Ali Nikrang: Right. We’re even experimenting with 16K in Deep Space! What’s difficult about the Live Stream is the transmission: how do you manage to transmit and receive the data on time and at an even rate over the public Internet, over which you have no control?
Roland Haring: To make things a little easier, we have a cooperation with LIWEST. For this project, they cover the last mile into the Ars Electronica Center. Internet service providers like LIWEST are interconnected via large fiber optic nodes. Every end customer must rent a line from the Internet provider of their choice in order to connect to the Internet. Everyone has to do this at home, and Ars Electronica as a company has to do it too. If we had run the 8K Live Stream over our usual Internet connection, that would have been problematic – it is already heavily used in everyday operations, since we constantly need the Internet for our work. For this reason, LIWEST provided us with a second Internet line, which is a normal public connection but runs separately from the first Ars Electronica line.
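The timing problem Ali describes – live data must arrive evenly, unlike a download – is commonly handled on the receiving side with a jitter buffer. The following is a minimal, hypothetical sketch of the idea, not the actual Immersify receiver: playback only starts once a few frames of headroom have accumulated, so irregular network arrival does not immediately cause dropouts.

```python
from collections import deque

class JitterBuffer:
    """Hypothetical sketch: hold a few frames so that uneven network
    arrival can still be played out at a steady rate."""

    def __init__(self, prefill=3):
        self.queue = deque()
        self.prefill = prefill   # frames to accumulate before playback starts
        self.playing = False

    def on_frame_received(self, frame):
        self.queue.append(frame)
        if not self.playing and len(self.queue) >= self.prefill:
            self.playing = True  # enough headroom to absorb jitter

    def next_frame_for_display(self):
        if self.playing and self.queue:
            return self.queue.popleft()
        return None              # underrun: a real player would repeat the last frame
```

A real 8K receiver must of course also handle timestamps, packet loss and hardware decoding, but the trade-off is the same: a deeper buffer tolerates more jitter at the cost of more latency.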
After the big event at the end of August, the Ars Electronica Festival will also be hosting a Live Stream. Can you tell me more about it?
Roland Haring: On Festival Friday, September 6, 2019, we’ll be showing a recap of our live stream of the dance performance Sanbasō, as well as two further live streams, one to Japan and one to Poland.
Ali Nikrang: The livestream to Japan uses the same technology that we will be using on August 28. However, with our Polish partner PSNC, we are using a different technology. They don’t work with special codecs; they only use freely accessible software. So it’s something everyone could use without special licenses for decoding or encoding.
Singing Sand 2.0 / Tadej Droljc. Credit: Robert Bauernhansl
The Live Stream is not the only premiere that will be presented by Immersify at this year’s festival. What can we expect?
Roland Haring: A big highlight is certainly the presentation of “Singing Sand 2.0” by Tadej Droljc. The Slovenian media artist has created an 8K stereoscopic animated film that shows abstract visualizations of music and sound on the floor and wall of Deep Space 8K. A very impressive experience. Theresa Schubert, artist in residence at our partner PSNC, shows “Immersive Minimalism”, a first result of her stay in Poland. We are also showing a big cooperation with BBC Studios and the Scan Lab from Great Britain about the pyramids of Giza. They scanned the famous World Heritage site three-dimensionally and created 360° videos on this basis, which we link to an interactive application in Deep Space 8K. You can freely control how you move through the narration and through the pyramid. Because it’s a 360-degree work, we work with an extreme resolution – you only ever see part of the video. For it to appear in high resolution, the whole video must have more than 8K – we are really at the limit of our hardware system here. We are not aware of any project of this kind having been realized in this form before – it’s a real highlight.
Roland Haring studied Media Technology and Design at Hagenberg University of Applied Sciences. Since 2003 he has been a member of the Ars Electronica Futurelab and one of the driving forces behind the lab’s R&D efforts. His activities include research and development in several large R&D projects with academic, artistic and commercial partners and collaborators. Currently Roland Haring is the Technical Director of Ars Electronica Futurelab and co-responsible for its general management, content conception and technical development. With his many years of experience in the (software) technical management of large-scale, research-intensive projects, he is an expert in the design, architecture and development of interactive applications.
Ali Nikrang is a senior researcher & artist at the Ars Electronica Futurelab, where he’s a member of the Virtual Environments research group. He studied computer science at Johannes Kepler University in Linz and classical music at the Mozarteum in Salzburg. Before joining Ars Electronica’s staff in 2011, he worked as a researcher at the Austrian Research Institute for Artificial Intelligence, where he gained experience in the field of serious games and simulated worlds.
The 8K Live Stream of the dance performance Sanbasō will take place on August 28, 2019 at 09:00 AM at Deep Space 8K at the Ars Electronica Center. At the Ars Electronica Festival a week later, on September 6, 2019, at 10:00 AM, there will be another opportunity to experience an 8K Live Stream. Find the rest of the Immersify program at the Ars Electronica Festival in the festival program.
To learn more about Ars Electronica, follow us on Facebook, Twitter, Instagram et al., subscribe to our newsletter, and check us out online at https://ars.electronica.art/news/en/.