The pharus tracking system was developed at the Ars Electronica Futurelab by researcher and artist Otto Naderer to provide interaction opportunities for groups of any size in virtually any environment. The special feature of this scalable system is its user-friendliness: pharus requires neither markers nor other devices to recognize people – they are detected simply by entering the monitored area. pharus thus meets a wide variety of requirements, be it the development of complex programs in Deep Space 8K at the Ars Electronica Center, participatory art performances or even the monitoring of assembly lines. It can therefore also be used in many ways by partners of the Ars Electronica Futurelab.
Since the late 1970s, experts have been working on improved interaction between humans and machines under the term “Human Computer Interaction” (HCI). Increasingly, however, the focus of researchers has shifted from interfaces for individuals to large groups of people: “Crowd Computer Interaction” is a specific subfield that has shaped HCI research since 2009. The CADET (Center for Advances in Digital Entertainment Technologies) research project, conducted by the Ars Electronica Futurelab and the Salzburg University of Applied Sciences from 2010 to 2014, therefore also focused on unconventional and intuitive forms of human-machine interaction. Research was conducted on interfaces and applications that can be controlled by multiple people – for example, on the basis of motion data.
In order to identify the position of a moving subject or object in space and to recognize movement patterns, a local positioning system was developed at the Ars Electronica Futurelab. The demands on the mechanism were exceptionally high: it had to remain robust under heavy use by large groups of people while simultaneously covering a correspondingly large interaction space and still operating with precision.
The tracking system, developed for this purpose by artist and researcher Otto Naderer, uses laser scanners (2D planar LiDARs) mounted around an interaction surface at ankle height. The first practical test already demonstrated the enormous potential of this approach: the multiple perspectives provided by the sensors reduce the effects of mutual occlusion in the room. Up to 30 people can thus move simultaneously in the scalable interaction space of the Ars Electronica Center’s Deep Space 8K. pharus tracking was conceived!
Years of continuous development have since extended pharus (Latin: ‘lighthouse’) into a remarkably versatile, highly precise and very responsive tracking system that follows people and objects over long distances, even in crowded environments. The system has already successfully coped with a wide variety of challenges.
How pharus’ tracking works
Depending on the application, the scalable tracking system is based on a smaller or larger number of 2D laser rangefinders. Each sensor typically has a rotating head equipped with an infrared laser source and a photodiode. By measuring the time of flight – from the emission of a laser pulse to its reception at the diode – the distance to the reflecting object can be calculated. This similarity to radar gave rise to the term “LiDAR” (Light Detection and Ranging). Because the measuring head rotates, each sensor records the contour of its surroundings.
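The two steps described above – converting a round-trip time of flight into a distance, and turning one rotation of range readings into a 2D contour – can be sketched as follows. This is a minimal illustration, not pharus code; the function and parameter names (`tof_to_distance`, `scan_to_contour`, `angle_min`, `angle_increment`) are assumptions for the example.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(t_seconds: float) -> float:
    """Round-trip time of flight of a laser pulse -> one-way distance in metres."""
    return C * t_seconds / 2.0

def scan_to_contour(ranges, angle_min, angle_increment):
    """Convert one rotation of range readings (polar) into 2D contour points.

    ranges: distances measured at successive head angles;
    angle_min / angle_increment: start angle and angular step in radians.
    """
    points = []
    for i, r in enumerate(ranges):
        theta = angle_min + i * angle_increment
        # polar (r, theta) -> Cartesian (x, y) in the sensor's own frame
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

A pulse returning after 20 nanoseconds, for instance, corresponds to a reflector roughly 3 metres away.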
The contours of all connected sensors are collected via the pharus processing pipeline. The measurement data of the individual sensors, also called “echoes”, are overlaid and evaluated. An essential part of this process compensates for the side effects of mutual occlusion: if one or more sensors lose line of sight, others can “take over” the person or object being tracked. Even if all sensors lose the track, pharus can predict the course of the motion for a certain time and thus bridge dropouts.
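The handover and dropout behaviour described above can be sketched with a simple constant-velocity track that coasts on its last known motion when no sensor sees the target. This is an illustrative simplification, not the pharus pipeline itself; the `Track` structure, the smoothing factor `alpha` and the `MAX_COAST` dropout tolerance are assumptions for the example.

```python
from dataclasses import dataclass

MAX_COAST = 0.5  # assumed maximum time (s) to extrapolate without any echo

@dataclass
class Track:
    x: float
    y: float
    vx: float
    vy: float
    coasting: float = 0.0  # seconds since the last fresh echo

def update(track: Track, echo, dt: float, alpha: float = 0.5) -> bool:
    """Feed one fused echo (x, y) or None if all sensors lost the target.

    Returns False once the track has coasted too long and should be dropped.
    """
    if echo is not None:
        ex, ey = echo
        # blend the measured velocity into the track (exponential smoothing)
        track.vx = alpha * ((ex - track.x) / dt) + (1 - alpha) * track.vx
        track.vy = alpha * ((ey - track.y) / dt) + (1 - alpha) * track.vy
        track.x, track.y = ex, ey
        track.coasting = 0.0
        return True
    # no sensor sees the target: extrapolate along the last known velocity
    track.coasting += dt
    if track.coasting > MAX_COAST:
        return False
    track.x += track.vx * dt
    track.y += track.vy * dt
    return True
```

In a full system a Kalman filter would typically take the place of this hand-rolled smoothing, but the principle – predicting forward through short dropouts and giving up after a timeout – is the same.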
pharus in Deep Space 8K
In the Ars Electronica Center’s Deep Space 8K, pharus tracking is used for many different applications:
Game Changer Suite is a collection of games designed for the participation of multiple users. With maximum physical effort, players – represented by their virtual avatars – can compete against each other in adventurous scenarios. Originally developed by students of FH OÖ Campus Hagenberg as a temporary installation for the 2014 Ars Electronica Festival, the Game Changer Suite has since been part of the interactive program in Deep Space 8K. It invites visitors to participate by picking from a series of fast-paced and agile mini-games to experience for themselves what possibilities the system offers and how it feels to be digitally “tracked.”
Based on the concept of “Cooperative Aesthetics”, developed by Univ.-Prof. Dr. Gerhard Funk, students of “Time-based and Interactive Media Art” at the University of Arts and Design Linz together with Prof. Funk have created 30 artworks for the Ars Electronica Center’s Deep Space 8K since 2015. Using the pharus laser tracking system, the artworks enable visitors to have a collective, audiovisual, aesthetic experience in Deep Space 8K. These projects are presented on Ars Electronica Home Delivery.
pharus – On mission for business, arts and culture
The range of applications for pharus is diverse and is by no means limited to Deep Space 8K and the Ars Electronica Center:
On a pedestrian bridge connecting two main buildings of the SAP Campus in Walldorf, Germany, pharus powers the interactive sound installation Building Bridges: the system translates the movements of pedestrians into music using a composition algorithm. The bridge thus serves as both stage and musical instrument – by simply crossing it, people create their own piece of music. Together with composer Rupert Huber, the Ars Electronica Futurelab realized the project in 2013.
Exploring the tension between control and self-determination in the age of digital surveillance, the performance SystemFailed uses the Ars Electronica Futurelab’s pharus tracking to question our ability to act in the face of a seemingly omnipotent digital system. Together with the audience, the artist collective ArtesMobiles examines the power structures of an increasingly digitalized world in its performative production. An AI becomes the protagonist of the play, collecting and neatly sorting the movement data of all participants over the course of SystemFailed. Algorithm-controlled systems are questioned in a playful way, and the participants soon face an interesting self-experiment: will they adapt? Or will they rebel?
pharus also explores new territory with the development of the Rotax MAXDome’s unique e-kart experience course. The key feature of this track is a 50-meter tunnel equipped with a floor projection system. During their fast ride, racers can collect additional “boosts” by mastering challenges. The fast and agile karts require a highly precise but also very responsive tracking system. Here, too, pharus was able to meet all requirements: in this setup, it works in conjunction with a coarse radio-based positioning system that handles identification, while pharus provides high accuracy and scan rate where needed.
Australian media artist Daniel Crooks also approached the Ars Electronica Futurelab with a very demanding challenge as part of his 2014 Artist Residency: in a “scan room,” a fully virtual sculpture was to be created from the movement of a single person. His artistic work focuses on the treatment of time as physical substance. With Real Imaginary Objects, Ars Electronica Futurelab researchers developed this space: using four sensors, they built an Object Slicer – a pharus configuration that calculates the cross-section of a detected body and stacks these slices into 3D models in real time. The results of this artistic research have been shown in numerous places around the world.
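The slice-stacking idea behind the Object Slicer can be sketched as follows: each 2D cross-section captured at a successive time step is lifted to a height proportional to its step index, yielding a 3D point cloud. This is a minimal sketch under assumed conventions (the function name `stack_slices` and the time-to-height mapping via a fixed spacing `dz` are illustrative, not the actual implementation).

```python
def stack_slices(slices, dz):
    """Stack successive 2D cross-sections into a 3D point cloud.

    slices: list of contours, each a list of (x, y) points captured at
    successive time steps; dz: vertical spacing added per step, so time
    becomes the third spatial dimension.
    """
    cloud = []
    for k, contour in enumerate(slices):
        z = k * dz  # later slices sit higher in the virtual sculpture
        cloud.extend((x, y, z) for x, y in contour)
    return cloud
```

A moving body thus leaves behind a frozen volume traced out over time – time treated as physical substance, as in Crooks’ work.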
pharus – On track for a mobile future
pharus’ usability on mobile platforms is currently being tested: in order to gain a better understanding of their environment via laser tracking, autonomous vehicles could be equipped with particularly compact sensors.
In order to support the recognition or differentiation of people and obstacles in terms of route planning and safety, pharus is also used for Ars Electronica Futurelab’s Cobot Studio project, where research is being conducted to optimize the coexistence of humans and machines.
pharus also offers potential for swarm research: bots equipped with pharus would be able to recognize and locate neighboring vehicles. This allows each bot to draw conclusions about its own position within the swarm, supporting the accuracy and reliability of its onboard positioning system.
Research & Development: Otto Naderer
Immerse yourself in our work
Interested in similar projects? The following Ars Electronica Futurelab projects are related to the ideas and concepts presented here. An overview of all our productions, cooperations and projects can be found in our project archive.