As humans and robots work together ever more closely, their joint success depends on certain preconditions: How do you create safe working environments? How can acceptance of robots in everyday work be increased? And how do you communicate with a colleague who consists of nothing but a gripper arm? In CoBot Studio, seven Austrian research institutions jointly investigated from 2019 to 2022 which robot signals are understandable to which groups of people in which work environments.
Interdisciplinary approach to interaction
Unlike conventional industrial robots, which are usually operated behind barriers for safety reasons, collaborative robots (“cobots”) are lightweight and safe enough to work in close proximity to humans. This closer collaboration, however, raises new questions: How can the machine communicate that it is waiting for input, and how can a person anticipate how its robotic arm will move?
From robotics and psychology to nonverbal communication and virtual reality, CoBot Studio combined a wide range of expertise to create new conditions for safety and trust in machines in the workplace of the future. Over a period of three years, the nationwide research project, led by the LIT Robopsychology Lab at Johannes Kepler University Linz in cooperation with the Ars Electronica Futurelab and five other project partners, addressed an important yet still elusive topic: future forms of collaboration with cobots were simulated in specially developed virtual environments in order to explore the two-way interaction between humans and machines.
From VR game to simulation with a robot
Collaborative robots and suitable test environments are rarely available in sufficient numbers for scientific experiments, so CoBot Studio took a new approach. The VR game “Rubberduck” served as the first part of the study: wearing a VR headset and using a controller, participants tried to control a virtual cobot in order to produce rubber ducks together with it. In the second part of the study, the participants were asked to clean up a virtual sea of plastic waste using the same setup.
For the finale of the study, this game was then brought to Deep Space 8K at the Ars Electronica Center in Linz, a virtual 3D simulation environment with a 16-by-9-meter wall projection and an equally large floor projection. The life-like conditions there were ideal for obtaining more meaningful data. Collaboration and communication were investigated in Deep Space 8K with both a virtual and a real mobile cobot. During the evaluation, the easily controllable and adaptable conditions of the digital test environment proved a great advantage.
During their interaction with the virtual or real cobot, the participants’ reactions to its various nonverbal signals were tested: the effect of gestures on the success of the collaboration, the limits and possibilities of communication, and how participants rated their personal experience of working with the cobot. From the data obtained and the complementary interviews, principles were derived for developing new movement patterns and visual signals that make robots more human-friendly. In the future, such guidelines should make it easier to create more pleasant, efficient, and safe working conditions. At the same time, CoBot Studio underlines the relevance of interdisciplinary partnerships for the development of human-centered technologies and working environments of the future.
Learn more about CoBot Studio in an interview with Martina Mara, expert in robot psychology at the LIT Robopsychology Lab at JKU Linz, and Roland Haring, Director and VR expert at the Ars Electronica Futurelab:
Credits
PARTNERS:
LIT Robopsychology Lab, Johannes Kepler Universität Linz
Center for Human-Computer Interaction, Universität Salzburg
Joanneum Robotics, JOANNEUM RESEARCH Forschungsgesellschaft mbH
Polycular OG
Austrian Research Institute for Artificial Intelligence (OFAI)
Blue Danube Robotics GmbH
The project was funded by the Austrian Research Promotion Agency (FFG) as part of the Ideen Lab 4.0 program.