Currently, embodied AI (e.g., smart objects, robots, smart personal assistants) is an expression of power: it can be used to support humans through human-agent relationships, but it can also reinforce existing societal biases and amplify injustice. Recently, we have seen a growing number of calls to consider how issues like gender, race, and disability play a role in AI technology. The DEI4 Embodied AI initiative focused on broadening participation in the development of AI in computer science (i.e., who makes decisions about what to develop and how, whose perspectives are taken, and which values to embed) and on changing current practices through an open societal conversation.
We developed four transdisciplinary tools to conduct futuring and critical design workshops with academics, designers, and participants from society.
We tested the tools in four international workshops with more than 200 participants.
The tools are:
- Reflect on implicit assumptions: This tool offers a two-step activity for tangible reflection on how we design embodied AI and how we imagine possible, probable, and desirable futures.
- Mapping privileges: This tool lets people reflect on personal positions of privilege. It invites participants to position themselves along binary axes generally associated with privilege (e.g., skin colour).
- Punkbot collages against the status quo: This tool supports overturning the status quo of robot design through ornamental activities inspired by punk techniques of the Letterist International.
- Exploring spaces between categories (a biased classifier): This tool aims to break with stereotypical expectations and thinking in binary categories. A classification algorithm (i.e., Teachable Machine) is used to surface our unconscious associations and narrow ways of categorizing things, helping us challenge gender norms and stereotypes.
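The mechanism behind the fourth tool can be illustrated with a minimal sketch (not the workshop's actual Teachable Machine setup, which runs in the browser): a binary classifier trained on stereotyped labels is forced to place every input into one of two categories, even inputs that sit between them. The feature encodings and labels below are hypothetical, chosen only to show the effect.

```python
# Minimal sketch, assuming scikit-learn is available. A classifier trained
# on a stereotyped binary must assign a hard label to every input; only the
# predicted probabilities reveal how arbitrary that assignment can be.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical 2-feature encodings of objects (e.g., colour, shape),
# labelled along a stereotyped binary: 0 = "feminine", 1 = "masculine".
X = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
y = np.array([0, 0, 1, 1])

clf = LogisticRegression().fit(X, y)

# An input exactly between the two categories still receives a hard label.
ambiguous = np.array([[0.5, 0.5]])
label = clf.predict(ambiguous)[0]        # forced into 0 or 1
probs = clf.predict_proba(ambiguous)[0]  # near-even split between classes
print(label, probs)
```

Surfacing the near-even probabilities for such in-between inputs is exactly the kind of evidence the tool uses to question binary categorization.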
The project focuses on the entanglements involved in reflecting on AI with and for society. It is relevant because designers and engineers are traditionally not trained in reflective practices that tackle social inequity, and thus, willingly or unwillingly, encode certain (negative) values into the systems they design.
We offer practical tools, insights, and a community to design for justice.