How to be a robot in a supermarket?
In a future where humans and robots mingle in everyday settings such as supermarkets, robots should be designed to act appropriately so that people accept their presence and know how to respond to them. For example, how can the robot signal it does not want to be disturbed when busy, or make clear it wants to pass and avoid bumping into you when moving around, or show it needs help if the situation calls for it?
Robots experience the world differently from humans. They can sense things humans can't, yet there are also many things that humans perceive that robots either can't or perceive differently. Robots in supermarkets, for example, can rapidly calculate inventory and tirelessly move products around, but they struggle with anything unexpected, be it a broken product, a sudden change of plans, or children running through the aisles.
These differences between humans and robots challenge designers. What methods can we use to design robot behaviour that acknowledges these differences instead of glossing over them by approaching robots as if they were humans? How can we take these differences as a starting point for designing ways of interacting that humans can relate to and understand?
We designed a VR environment that allows humans to experience what it is like to be a robot in a supermarket. This VR environment makes it possible to step into the shoes of the robot, see the world the way the robot does, and interact with humans as if you are a robot. In this way, we investigated how to make the robot behave in ways that are in line with its limited capacities, while also being readable by humans.
We have created a novel design tool: a simulator of what it is like to be a robot in a supermarket. The interface for controlling the robot is designed to give a realistic feel of how a robot with limited capabilities can engage with the physical and social world. This interface, which we jokingly refer to as the "straitjacket", was created with input from roboticists to reflect the state of robotic technology now and over the next 5-10 years. Putting yourself in the shoes of the robot, restraints and all, was fundamental to understanding how robots can be socially acceptable, for example in supermarkets, while acknowledging their technical limitations.
In a recent experiment, we asked a professional puppeteer to use this tool to explore future encounters with supermarket visitors played by actors on a mixed reality stage. This provided valuable insights into how interactions emerge between humans and robots, based on the encounter of two situated bodies. For example, when the robot had a head, people assumed it was capable of much more, such as understanding speech, than when it was headless. When the robot could only move one body part at a time, it was seen more as a "thing"; when it could move body parts in parallel, it felt more like a "creature". Small delays in how quickly the robot could respond completely changed how much of a connection was felt.
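The project does not publish the simulator's internals, but the constraints named above (a head or no head, sequential versus parallel movement, injected response delays) can be pictured as a small capability profile that the simulator enforces on the operator. Below is a minimal sketch in Python; all names and parameters are hypothetical illustrations, not the actual implementation:

```python
import time
from dataclasses import dataclass
from typing import Optional

@dataclass
class CapabilityProfile:
    """Hypothetical 'straitjacket' configuration: what the operator may do."""
    has_head: bool = False         # a head invites expectations such as speech understanding
    parallel_motion: bool = False  # False: only one body part may move at a time
    response_delay_s: float = 0.5  # injected lag between sensing a person and reacting

class ConstrainedRobot:
    """Illustrative sketch of a robot whose actions are gated by a profile."""

    def __init__(self, profile: CapabilityProfile):
        self.profile = profile
        self.moving_part: Optional[str] = None

    def request_move(self, part: str) -> bool:
        """Grant a movement only if the capability profile allows it right now."""
        if not self.profile.parallel_motion and self.moving_part not in (None, part):
            return False  # sequential mode: another body part is already in motion
        self.moving_part = part
        return True

    def stop(self, part: str) -> None:
        if self.moving_part == part:
            self.moving_part = None

    def respond(self, event: str) -> str:
        time.sleep(self.profile.response_delay_s)  # even small delays change the felt connection
        return f"reacting to {event}"

# A headless robot that moves one body part at a time, with noticeable lag:
robot = ConstrainedRobot(CapabilityProfile())
assert robot.request_move("arm")
assert not robot.request_move("base")  # blocked while the arm is moving
robot.stop("arm")
assert robot.request_move("base")
```

Varying a profile like this, rather than the robot's appearance alone, is one way to probe which limitations (latency, sequential motion) most shape how people read the robot.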
Allowing the general public to enter this experience lets them grasp how to be a robot in a supermarket, something that is hard to capture in words. The experience aims to build awareness of the current state of robot technology, creating a realistic sense of what robots are capable of and of the ways they can mingle with us.
This project has been a collaboration between Delft University of Technology and Utrecht University. AIRLab Delft funded this project and the XR Zone Delft helped realise it.
Contacts
- Marco Rozendaal
- Jered Vroon
- Maaike Bleeker
- Arend-Jan Krooneman
- Arno Freeke
Links
Research collaboration between the Expressive Intelligence Lab and AIRlab Delft
For more on developing virtual reality for research
For more on theatre and robotics