Swarm Compass: A Sea Change in Visualizing Events by Swarm Intelligence

A possible scenario in which swarm-intelligent signage guides people in the right direction. Credit: Ars Electronica Futurelab/NTT

The basic concept behind Swarm Compass is to navigate people by utilizing swarm intelligence and to present a totally new medium in the field of entertainment and communication services. Going beyond simple signage toward swarm-based social communication services is a task that Japanese telecommunications giant NTT and the Ars Electronica Futurelab have already invested in together, using the Spaxels with the Sky Compass. The swarm infotainment demonstrated by flying intelligent objects is a prototype designed for use in Tokyo in the year 2020, when it will show visitors and residents the way to their destinations in a playful manner and will visualize exciting events. The audience at the Ars Electronica Festival witnessed the first steps in creating new types of social services based on swarm intelligence. In this interview, Hideaki Ogawa, creative catalyst, artist, curator and researcher in the field of art, technology, and society at the Ars Electronica Futurelab, gives an overview of recent advances in this research, and Shingo Kinoshita, head of the 2020 Epoch-making Project at NTT Service Evolution Laboratories in Japan, gives us some insight into future NTT technologies for the year 2020.

How does the Swarm Compass differ from the Sky Compass?

Hideaki Ogawa: As shown by the research we performed last year on the utilization of the Spaxels®, they have reached a new dimension as movable objects that can profoundly influence the way we perceive reality. Our research has now advanced beyond utilizing them as a tool for navigation and signage, which we embarked on with the Sky Compass, a project the Ars Electronica Futurelab developed with NTT, one of the biggest Japanese telecommunications companies (http://www.ntt.co.jp/index_e.html). Just as we probed the possibilities of developing a so-called Sky Language, we now enter a new stage by asking the question “How can these flying vehicles work as a role model for swarm intelligence?”

Considering that Horst [Horst Hörtner, senior manager of the Ars Electronica Futurelab] established the Spaxels as quadcopters interlinked as flying building blocks to generate objects with material-visual characteristics (https://ars.electronica.art/aeblog/2014/09/10/smart-atoms/), the idea of exploring unmanned aerial vehicles (UAVs) as particles that make up an intelligent swarm of robots was already in place when the Spaxels Research Initiative was introduced at the 2017 Ars Electronica Festival (https://ars.electronica.art/ai/en/spaxels-research-initiative/). We also showed the state of development by staging a demonstration that resembled the physical presentation of the Sky Compass at the NTT R&D Forum in February 2017. It utilized the same symbols and a number of scenes to give the audience a glimpse of what we mean by “Sky Language” and “User Responsibility Design”. That is still related to the concept of signage and visual communication addressed to the public.


Visitors to the Ars Electronica Festival 2017 could witness a demonstration of Spaxels performing the basics of what Sky Language is all about. Credit: Florian Voggeneder

Is there any new technology involved by which swarm intelligence will be manageable?

Hideaki Ogawa: As NTT has joined forces with the Ars Electronica Futurelab to animate Japan in the year 2020, we will flesh out the concept of swarm intelligence, which should yield a signage system and a system of flying screens delivering spectacular renderings of events, such as in sports. Compared to the technology we have used to control the Spaxels so far, the challenge of creating a new spectacular sensation will be mastered with brand-new software, SwarmOS, developed by the Spaxels® R&D department. SwarmOS operates the mission autonomously from takeoff to landing, and the flight controllers merely supervise the swarm during operation. They monitor the performance of the swarm via the SwarmControls 3-D graphical interface and can manually override it in real time if required. The SwarmOS design tools reflect a highly automated design approach that generates a collision-free path for each participating UAV in the swarm. This is a huge step beyond the original technology, and it makes interaction between the individual objects possible.
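To give a rough idea of what such an automated design approach could look like, here is a minimal sketch of prioritized, collision-free path planning for a small swarm. It is not the actual SwarmOS implementation; the straight-line paths, the separation distance, the timestep and all names are illustrative assumptions.

```python
# Minimal sketch: prioritized, collision-free path generation for a UAV swarm.
# NOT the SwarmOS implementation; all parameters and names are assumptions.
import numpy as np

MIN_SEPARATION = 2.0   # assumed minimum allowed distance between UAVs (m)
SPEED = 1.5            # assumed cruise speed (m/s)
DT = 0.5               # planning timestep (s)


def straight_line_path(start, goal, start_delay=0.0):
    """Sample a constant-speed straight-line path as (time, position) pairs."""
    start, goal = np.asarray(start, float), np.asarray(goal, float)
    duration = max(np.linalg.norm(goal - start) / SPEED, DT)
    times = np.arange(0.0, duration + DT, DT)
    return [(start_delay + t, start + (goal - start) * min(t / duration, 1.0))
            for t in times]


def conflicts(path_a, path_b):
    """True if two timed paths ever come closer than MIN_SEPARATION."""
    for t_a, p_a in path_a:
        for t_b, p_b in path_b:
            if abs(t_a - t_b) < DT and np.linalg.norm(p_a - p_b) < MIN_SEPARATION:
                return True
    return False


def plan_swarm(missions):
    """Prioritized planning: each UAV delays takeoff until its path is clear."""
    planned = []
    for start, goal in missions:
        delay = 0.0
        path = straight_line_path(start, goal, delay)
        while any(conflicts(path, other) for other in planned):
            delay += DT                      # push takeoff back and retry
            path = straight_line_path(start, goal, delay)
        planned.append(path)
    return planned


if __name__ == "__main__":
    # Two crossing missions: the second UAV's takeoff is delayed automatically.
    paths = plan_swarm([((0, 0, 5), (10, 10, 5)),
                        ((10, 0, 5), (0, 10, 5))])
    for i, path in enumerate(paths):
        print(f"UAV {i}: takeoff at t = {path[0][0]:.1f} s, {len(path)} waypoints")
```

Real swarm planners optimize full trajectories rather than simply delaying takeoffs, but the sketch shows the core idea of checking pairwise separation over time before a mission is cleared to fly.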

What’s the aim of the development in terms of practical application?

Hideaki Ogawa: From now on, we’re pursuing manageability so as to have an array of unlimited flying screens rendering an image of an event which, previously, could be seen only from afar. In this initial stage, we have developed the flying screen prototype, a large number of which will be flown in the future. Imagine a javelin thrower or a pole-vaulter whom spectators in seats high up in the arena can see only as tiny figures, or view only as a representation on an LCD video wall. In order to achieve a completely new experience, we operate aerial movie screens, which make it possible to share the excitement far more intensely than anything available to date. Swarm Display applications let you imagine swarm communication services. As spectators watch the dynamic movements of athletes rendered by a swarm display, they will also have the option of tracing those movements through data delivered by big-data analysis, a field in which NTT has tremendous expertise. By combining the sport’s big-data analysis with the expressive means of the swarm itself, we’ll create outstanding moments.
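To make this more concrete, here is a minimal sketch of how a swarm display might trace an athlete from tracking data: timestamped positions are buffered and each buffered point is assigned to one flying screen, so the swarm outlines the most recent trajectory in the air. The class, parameter names and height offset are illustrative assumptions, not actual NTT or Futurelab interfaces.

```python
# Minimal sketch: tracing an athlete's movement with a swarm display.
# All names and the fixed height offset are illustrative assumptions.
from collections import deque


class SwarmTrailDisplay:
    def __init__(self, num_screens, height_offset=8.0):
        self.trail = deque(maxlen=num_screens)  # last N athlete positions
        self.height_offset = height_offset      # fly screens above the field

    def update(self, athlete_position):
        """Feed one (x, y, z) tracking sample and get per-screen targets."""
        self.trail.append(athlete_position)
        targets = []
        for i, (x, y, z) in enumerate(self.trail):
            # Older samples are assigned to screens further back in the trail.
            targets.append({"screen_id": i,
                            "target": (x, y, z + self.height_offset)})
        return targets


# Example: a pole-vaulter's run-up sampled at a few points.
display = SwarmTrailDisplay(num_screens=4)
for sample in [(0, 0, 1), (3, 0, 1), (6, 0, 1.5), (9, 0, 3), (11, 0, 5)]:
    targets = display.update(sample)
print(targets)  # the four screens now outline the most recent trajectory
```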

Is this concept of entertainment limited to a certain type of event or place?

Hideaki Ogawa: What we call Swarm Arena can actually be transported to any remote place in the world; it does not necessarily have to entail action in a sports arena per se. All those moments can be transported and relived. In public viewing, this experience is augmented by the actors involved. Compared to the original Spaxels, which were basically a combination of dots forming a picture, you are going to have a multitude of screens, each of which displays a detail of a real-time event. That is a totally new medium, one that animates reality as scenery. We will continue to show our latest developments at upcoming events. As a preliminary stage, we focused on the sky as the canvas for the swarm of intelligent objects, but in the long run we won’t leave it at that; instead, we’ll broaden the approach to different sorts of canvases.
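As a rough illustration of the “multitude of screens” idea, the following minimal sketch splits a single frame of a real-time event into tiles and hands each tile to one flying screen. The fixed grid layout and the function name are assumptions; the actual Swarm Arena media pipeline is not shown.

```python
# Minimal sketch: splitting one frame of a live event across flying screens.
# A plain NumPy array stands in for the video frame; names are assumptions.
import numpy as np


def tile_frame(frame, rows, cols):
    """Split an H x W x 3 frame into rows*cols tiles, one per flying screen."""
    h, w = frame.shape[0] // rows, frame.shape[1] // cols
    tiles = {}
    for r in range(rows):
        for c in range(cols):
            screen_id = r * cols + c
            tiles[screen_id] = frame[r * h:(r + 1) * h, c * w:(c + 1) * w]
    return tiles


# Example: a 480x640 test frame spread over a 3x4 formation of flying screens.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
tiles = tile_frame(frame, rows=3, cols=4)
print(len(tiles), tiles[0].shape)  # 12 screens, each showing a 160x160 detail
```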


Hideaki Ogawa explaining the group navigation of the swarm during the presentation of the Swarm Compass at Ars Electronica Festival 2017. Credit: Florian Voggeneder

Shingo Kinoshita, how would you utilize NTT’s advanced ICT for Swarm Arena?

Shingo Kinoshita: Concerning the realization of the Swarm Arena, we see several challenges, such as the communication between the swarm and the people and the dynamic, interactive control of large and heterogeneous swarms; in other words, everything will have to be done in real time, on the fly. We at NTT can contribute to this project by integrating our communication processing technologies such as 5G, high-density Wi-Fi and media-synchronous delivery.

In particular, this would mean media processing such as speech recognition and synthesis, intelligent microphones, image recognition and image processing. NTT also specializes in machine learning and artificial intelligence technologies such as human flow prediction, navigation technology and quantum neural networks.

In the realm of the Internet of Things (IoT), we have developed a special device cooperation control technology (R-env) that will enable us to build spontaneous networks of internet-enabled devices, sensors and even whole infrastructures. On the level of user experience design, we will cooperate closely with the Ars Electronica team to develop new interfaces for high-level communication between humans and swarm-enabled devices. Our robust real-time kinematic (RTK) positioning system will help us perform precise and dynamic human flow measurement and feed this data to our swarm-enabled network of intelligent devices.
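As a loose illustration of how such precise positioning data could feed a swarm-enabled network, here is a minimal sketch that aggregates position samples (stand-ins for RTK measurements) into a coarse human-flow grid and derives a guidance heading a swarm device could indicate. All function and field names are assumptions; the real R-env and RTK interfaces are not shown.

```python
# Minimal sketch: turning positioning samples into a human-flow estimate that
# could be handed to swarm-enabled devices. All names are assumptions.
import math
from collections import defaultdict

CELL = 5.0  # grid cell size in meters


def flow_grid(position_pairs):
    """Aggregate (previous, current) position pairs into per-cell mean flow."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (x0, y0), (x1, y1) in position_pairs:
        cell = (int(x1 // CELL), int(y1 // CELL))
        sums[cell][0] += x1 - x0
        sums[cell][1] += y1 - y0
        sums[cell][2] += 1
    return {cell: (dx / n, dy / n) for cell, (dx, dy, n) in sums.items()}


def guidance_heading(flow, cell):
    """Heading (degrees) a swarm device over `cell` could indicate to people."""
    dx, dy = flow.get(cell, (0.0, 0.0))
    return math.degrees(math.atan2(dy, dx))


# Example: three pedestrians moving roughly eastwards within one grid cell.
pairs = [((0, 0), (1.0, 0.1)), ((1, 1), (2.1, 1.0)), ((2, 0), (3.0, 0.2))]
flow = flow_grid(pairs)
print(guidance_heading(flow, (0, 0)))  # ~5 degrees, i.e. roughly "head east"
```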

What would you like to bring to Japan in the year 2020?

Shingo Kinoshita: We would like to realize an epic moment for the people of Tokyo. Our joint research and development with Ars Electronica is aiming not only for the year 2020 but for the years after 2020 as well. We at NTT would like to create a completely new domain of communication experiences for dynamic, swarm-enabled social and public entertainment services and lay the basis for a new communication technology that will appear for the first time in the public sphere of Tokyo in 2020.


Hideaki Ogawa (JP/AT), Creative Catalyst, Artist, Curator and Key Researcher at the Ars Electronica Futurelab. He has realized many innovation projects with industry partners such as Honda R&D, Toshiba, Toyota and Hakuhodo. His special focus is on Art Thinking as a way to catalyze innovation. His leading project, Future Catalysts, is a creative and innovation base jointly developed by Hakuhodo and Ars Electronica. Through “synergy” with distinctive worldwide innovators in the fields of art, science, and technology, the project produces new concepts, ideas, and strategies that serve as answers to various “creative questions”. In addition to artistic innovation research, Hideaki Ogawa has realized international projects for festivals, export programs such as Ars Electronica in the Knowledge Capital, and the Ars Electronica Center. His special themes are “Creative Catalyst” and “Robotinity – what is the nature of being a robot”. Hideaki Ogawa is also a representative and artistic director of the media artist group “h.o.”. He searches for witty new ideas rooted in current social contexts and realizes artistic expressions that keep pace with technological progress.


Shingo Kinoshita (JP), Executive Research Engineer, Supervisor, and Project Director, NTT Service Evolution Laboratories. He received a B.E. from Osaka University in 1991 and an M.Sc. with Distinction in technology management from University College London, UK, in 2007. Since joining NTT laboratories in 1991, he has been engaged in R&D of distributed computing systems, security, big data computing, and machine learning. Shingo Kinoshita was a senior manager of the R&D planning section of the NTT holding company from 2012 to 2015, where he established and operated NTT Innovation Institute, Inc. in North America and managed R&D alliances and venture investments. He is presently in charge of the overall direction of various experimental R&D activities toward 2020, including assistant services for foreigners and entertainment services such as kabuki and SXSW. Shingo Kinoshita received the 2005 IPSJ R&D Award from the Information Processing Society of Japan (IPSJ), the 2003 CSS (Computer Security Symposium) Best Paper Award, the 1998 DICOMO Best Presentation Award, the 2017 Cool Japan Matching Awards Grand Prix, the 2017 Spikes Asia Innovation Category and Music Category shortlists, and the 2017 ACC Innovation Category New Technology Awards.

To learn more about Ars Electronica, follow us on Facebook, Twitter, Instagram et al., subscribe to our newsletter, and check us out online at https://ars.electronica.art/news/en/.
