Japanese telecommunications giant NTT and the Ars Electronica Futurelab have been working together since 2017 on using drones, also known as unmanned aerial vehicles, as a means of communication. The Sky Compass project laid the groundwork; now, Swarm Compass takes the initiative to the next level.
The mission is to establish swarm-based technology in the field of entertainment and communications services. What if, for instance, drone swarms could provide routing information to pedestrians? Or consider individualized city tours conducted by personal drones. How about flying screens that add even more excitement to broadcast events? The initial project, Sky Compass, focused the R&D effort on areas such as navigation and signage; in the follow-up project, Swarm Compass, NTT and the Ars Electronica Futurelab are now developing even more complex possibilities for deploying swarm technology.
The technology originally developed in this collaboration was employed and fine-tuned to, among other tasks, control Ars Electronica’s SPAXELS® drone swarm. The presentation of the SPAXELS® Research Initiative at the 2017 Ars Electronica Festival and Sky Compass’s debut at the NTT R&D Forum in February 2017 gave the general public its first glimpses of how quickly development has proceeded.
And a lot has happened since then. Swarm Compass’ unmanned aerial vehicles are no longer just for navigation purposes; instead, they’re used as particles assembled to form an intelligent swarm of robots. An array of airborne screens that can interact and work together via swarm intelligence offers the possibility of spectacular renderings. For example, the movements of athletes in an arena could be broadcast in larger-than-life size and in motion on flying screens. The plan is for these ideas to make their public debut in Tokyo in 2020.
Bringing this vision to fruition calls for the corresponding technology. To pilot the drones automatically, the SPAXELS® R&D Department developed SwarmOS, brand-new software that enables such missions to proceed completely autonomously from takeoff to landing. Flight controllers merely supervise the swarm’s performance while it’s airborne and, if necessary, can manually override commands. This highly automated approach generates a collision-free flight path for every drone in the swarm while ensuring that each flying object can interact with all the others.
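The project description does not go into SwarmOS internals, but the general idea of verifying collision-free flight paths can be illustrated with a small sketch. The Python below is purely an assumption-laden example, not the SwarmOS implementation: the names (Trajectory, min_separation) are hypothetical, and each drone’s planned path is simplified to a list of time-synchronized waypoints whose pairwise separation is checked over the whole mission.

```python
"""Illustrative sketch only: pairwise minimum-separation check for planned
drone trajectories. Names and the time-synchronized waypoint model are
assumptions for this example, not part of SwarmOS."""

from dataclasses import dataclass
import numpy as np


@dataclass
class Trajectory:
    """A drone's planned path: XYZ positions (metres) sampled at fixed time steps."""
    positions: np.ndarray  # shape (T, 3)


def min_pairwise_distance(a: Trajectory, b: Trajectory) -> float:
    """Smallest distance between two drones over their common time span."""
    steps = min(len(a.positions), len(b.positions))
    deltas = a.positions[:steps] - b.positions[:steps]
    return float(np.linalg.norm(deltas, axis=1).min())


def is_collision_free(trajectories: list[Trajectory], min_separation: float = 2.0) -> bool:
    """Return True if every pair of drones keeps at least `min_separation` metres apart."""
    for i in range(len(trajectories)):
        for j in range(i + 1, len(trajectories)):
            if min_pairwise_distance(trajectories[i], trajectories[j]) < min_separation:
                return False
    return True


if __name__ == "__main__":
    # Two drones flying parallel straight lines 3 m apart, sampled once per second.
    t = np.linspace(0.0, 10.0, 11)
    drone_a = Trajectory(np.column_stack([t, np.zeros_like(t), np.full_like(t, 5.0)]))
    drone_b = Trajectory(np.column_stack([t, np.full_like(t, 3.0), np.full_like(t, 5.0)]))
    print(is_collision_free([drone_a, drone_b], min_separation=2.0))  # True
```

In practice such a check would be only one element of a planner that also accounts for geofences, timing, and vehicle dynamics when generating the paths in the first place.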
At present, the highest development priority with regard to practical applications is manageability. On one hand, Swarm Display applications make it possible to deliver swarm-based communications services; on the other, spectators also have the option of assessing, say, athletes’ movements with the aid of big data. The result: a big-data-enhanced analysis of an athletic event or performance combined with the expressive means of the swarm.
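How an athlete-tracking feed might drive the flying screens is not specified in the project description. Purely as an illustration, the sketch below maps tracked 2D athlete positions onto hover targets for drones above the arena; the field dimensions, display footprint, hover height, and linear scaling are all invented for this example.

```python
"""Illustrative sketch only: mapping tracked athlete positions (a "big data"
feed) to hover targets for drone-mounted screens. All dimensions and the
simple linear scaling are assumptions, not part of Swarm Compass."""

import numpy as np


def athlete_to_drone_targets(athlete_xy: np.ndarray,
                             field_size: tuple[float, float] = (100.0, 70.0),
                             display_size: tuple[float, float] = (40.0, 28.0),
                             hover_height: float = 30.0) -> np.ndarray:
    """Scale 2D athlete positions on the field to 3D hover targets for drones."""
    scale = np.array(display_size) / np.array(field_size)
    xy = athlete_xy * scale
    z = np.full((len(xy), 1), hover_height)
    return np.hstack([xy, z])


if __name__ == "__main__":
    # Three tracked athletes on a 100 m x 70 m field.
    athletes = np.array([[10.0, 35.0], [50.0, 20.0], [90.0, 60.0]])
    print(athlete_to_drone_targets(athletes))
```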
Read more in the interview with Hideaki Ogawa of the Ars Electronica Futurelab and Shingo Kinoshita of NTT on the Ars Electronica Blog.
Credits
Ars Electronica Futurelab: Chris Bruckmayr, Horst Hörtner, Peter Holzkorn, Michael Mayr, Otto Naderer, Nicolas Naveau, Hideaki Ogawa, Benjamin Olsen, Jonathan Rutherford
NTT: Hiroshi Chigira, Kyoko Hashiguchi, Shingo Kinoshita, Kenya Suzuki