Since 2013, the Ars Electronica Futurelab and Mercedes-Benz have been facing one of the most challenging issues raised by autonomous mobility: How can we humans communicate with self-driving cars in ways that make us feel comfortable and safe?
To approach this question, the Ars Electronica Futurelab set up two interactive proving grounds in which test subjects can experience the so-called shared space, the zone that human pedestrians and robotic motorists will use jointly in the future. The first of these haptic simulation setups, the Shared Space Spaxels, made its public debut in summer 2014 at the Mercedes-Benz Future Talk Robotics.
As the name suggests, the Spaxels developed at the Ars Electronica Futurelab play a central role in this project. These quadcopters are equipped with LED modules and an infrared tracking system that pinpoints their position in space down to the millimeter. In an 8-by-8-meter interaction zone, three Spaxels and up to three human test subjects simulate various traffic scenarios. In these scenarios, the quadcopters communicate with the humans in their immediate surroundings via light signals or predefined in-flight motions. The people, in turn, can “talk” to the flying robots via a haptic interface or physical gestures, for instance summoning a quadcopter with an upraised arm or dispatching it to a particular parking spot by pointing a finger toward it.
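To make this interaction loop concrete, the following minimal Python sketch shows how tracked gestures could be mapped to flight commands. All names here (TrackedBody, Spaxel, fly_to) and the hover offsets are illustrative assumptions, not the Futurelab's actual software.

```python
# Hypothetical sketch of the gesture-to-command loop described above.
# TrackedBody, Spaxel and all other names are illustrative assumptions,
# not the Futurelab's actual software.
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TrackedBody:
    """Pose of one test subject, delivered millimeter-precise by the tracking system."""
    position: Vec3                   # x, y, z in meters
    gesture: Optional[str] = None    # e.g. "raise_arm", "point", or None
    pointing_at: Optional[Vec3] = None

class Spaxel:
    """Stand-in for one LED-equipped quadcopter."""
    def fly_to(self, target: Vec3) -> None:
        print(f"flying to {target}")

def dispatch(body: TrackedBody, spaxel: Spaxel) -> None:
    """Map a recognized gesture to a flight command."""
    if body.gesture == "raise_arm":
        # Summon: hover one meter in front of the person at shoulder height.
        x, y, z = body.position
        spaxel.fly_to((x, y + 1.0, 1.5))
    elif body.gesture == "point" and body.pointing_at is not None:
        # Dispatch: send the quadcopter to the indicated parking spot.
        spaxel.fly_to(body.pointing_at)
```

In a real setup, the gesture labels would come from the tracking pipeline and the flight targets from a path planner with collision avoidance.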
Shared Space Spaxels: Interaction by Gestures
Our initial testing revealed that communication by means of gestures entails several challenges; cultural differences and comprehension problems arising from unfavorable camera angles are only two such issues. Nevertheless, simple arm and finger motions, such as projecting the palm forward to mean STOP, have proven to be an intuitive means of communication.
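As a toy illustration of how such a palm-forward STOP could be detected, here is a naive heuristic sketch. It assumes the tracking system reports hand and shoulder positions plus a palm normal vector; the joint data and the 30-degree threshold are our assumptions, not the project's values.

```python
# Naive heuristic for the palm-forward STOP gesture. Joint names and the
# angle threshold are illustrative assumptions, not the project's values.
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]

def is_stop_gesture(hand: Vec3, shoulder: Vec3, palm_normal: Vec3,
                    toward_drone: Vec3, max_angle_deg: float = 30.0) -> bool:
    """True if the hand is raised to shoulder height or above and the palm
    roughly faces the oncoming quadcopter."""
    if hand[2] < shoulder[2]:          # z is height: hand not raised enough
        return False
    # Angle between the palm normal and the direction toward the drone.
    dot = sum(p * d for p, d in zip(palm_normal, toward_drone))
    norms = math.hypot(*palm_normal) * math.hypot(*toward_drone)
    if norms == 0.0:
        return False
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))
    return angle <= max_angle_deg
```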
In any case, it is especially important that mobile robots be able to communicate with us in unambiguous terms. Proactive communication on the part of self-driving cars, that is, promptly signaling their states, processes and present intentions, constitutes an essential precondition for pleasant human-machine coexistence. This is one of the essential ideas that went into configuring the Shared Space Spaxels, for example in an interactive scenario in which a test subject crosses a quadcopter’s flight path. Depending on the test conditions, the approaching Spaxel acts in one of several ways: either it brakes without having previously announced its intention to the interaction partner, or it proactively signals in advance, with a green light, that the human pedestrian has been recognized and can safely cross. Alternatively, the Spaxel can switch into reception mode and react to the test subject’s STOP gesture.
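The three behaviors described above amount to a small decision policy. Below is a minimal sketch of that logic, assuming a simple mode switch and a hypothetical Spaxel API with brake and set_led commands; none of these names come from the actual system.

```python
# Minimal decision logic for the crossing scenario; the mode names and the
# Spaxel API (brake, set_led) are assumptions made for illustration.
from enum import Enum, auto

class Mode(Enum):
    SILENT = auto()       # brake without announcing the intention
    PROACTIVE = auto()    # signal green: "pedestrian recognized, cross safely"
    RECEPTIVE = auto()    # act only on the pedestrian's STOP gesture

class Spaxel:
    def brake(self) -> None:
        print("braking")
    def set_led(self, color: str) -> None:
        print(f"LED: {color}")

def on_pedestrian_in_flight_path(spaxel: Spaxel, mode: Mode,
                                 stop_gesture_seen: bool) -> None:
    if mode is Mode.SILENT:
        spaxel.brake()                 # stop, but communicate nothing
    elif mode is Mode.PROACTIVE:
        spaxel.set_led("green")        # announce recognition first ...
        spaxel.brake()                 # ... then yield
    elif mode is Mode.RECEPTIVE and stop_gesture_seen:
        spaxel.brake()                 # obey the human's STOP command
```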
Shared Space Spaxels: Magic Car Key
On the whole, the Shared Space Spaxels convey an idea of how it will feel to be a pedestrian amidst smart autonomous robots. And even if our Spaxels are, of course, no match for real cars in terms of weight and size, they nonetheless constitute practicable proxies, especially in this early exploratory phase of R&D. Flying at speeds of up to 60 km/h and at an altitude that we purposely set at the test subjects’ shoulder height, they make a strong haptic impact. These are key factors that make the Shared Space Spaxels an interesting alternative or supplement to virtual simulation environments.
Shared Space Spaxels: Collision Avoidance
Credits
Research & Development: Roland Haring, Christopher Lindinger, Alexander Mankowsky, Martina Mara