Crossing Realities for Mutual Understanding in Human-Robot Teams
- Project Period: 06/2019 - 11/2022
- Funders: Austrian Research Promotion Agency (FFG)
- Project Partners: LIT Robopsychology Lab (Lead), Joanneum Robotics, Blue Danube Robotics, Polycular, Ars Electronica Futurelab, Center for HCI of University of Salzburg, Austrian Research Institute for Artificial Intelligence (OFAI)
In the coming years, more and more workplaces will be equipped with collaborative robots (CoBots) that hold workpieces for employees, assemble car seats together with them, inspect packages or work physically close to people in other ways.
Understanding the states and next steps of such a robot is not only important for the successful completion of joint tasks, but also plays a major role in its acceptance by human interaction partners. After all, safety and trust are basic human needs built on mutual understanding. For human-robot collaboration, this means that just as the states and intentions of the human partner must be identifiable for the robot, so, conversely, the states and planned actions of the robot must be understandable and predictable for the human.
This is where the interdisciplinary consortium of the CoBot Studio comes in with new, creative methods. The project focuses on the development of an immersive extended reality simulation in which communicative collaboration processes with mobile robots can be playfully tried out and studied under controllable conditions. Using the technical infrastructure of "Deep Space 8K", a physical setting for virtual reality at the Ars Electronica Center in Linz, as well as other VR and AR technologies, participants take part in collaborative games with robots in which tasks such as assembling or sorting small objects have to be completed together. During the games, the robot's non-verbal communication signals are varied, and their impact on collaboration success, comprehensibility, and the interaction experience as a whole is evaluated.
Based on the findings from the collaborative games and supplementary quantitative and qualitative interviews, practice-oriented design principles for motion sequences and new visual signaling devices for collaborative robots are derived. The CoBot Studio thus combines physical and virtual levels of perception into a flexible research environment.