
Face to Face with a Sexist Robot: Investigating how women react to sexist robot behaviors

Social robots are often created with gender in mind, for example by giving them a designed gender identity or including elements of gender in their behaviors. However, even if unintentional, such social robot designs may have strong gender biases, stereotypes, or even sexist ideas embedded in them. Between people, we know that exposure to even mild or veiled sexism can have negative impacts on women. However, we do not yet know how such behaviors will be received when they come from a robot. If a robot offers to help only women (and not men) lift objects, for example, thus suggesting that women are weaker than men, will women see it as sexist, or just dismiss it as a machine error? In this paper we engage with this question by studying how women respond to a robot that demonstrates a range of sexist behaviors. Our results indicate not only that women have negative reactions to sexist behaviors from a robot, but that the male-typical work tasks common to robots (i.e., factory work, using machinery, and lifting) are enough for stereotype activation and for women to exhibit signs of stress. Particularly given the male-dominated demographics of computer science and engineering and the emerging understanding of algorithmic bias in machine learning and AI, our work highlights the potential for negative impacts on women who interact with social robots.

https://doi.org/10.1007/s12369-023-01001-4

Garcha, D., Geiskkovitch, D., Thiessen, R. et al. Face to Face with a Sexist Robot: Investigating How Women React to Sexist Robot Behaviors. Int J of Soc Robotics (2023). https://doi.org/10.1007/s12369-023-01001-4

Authors

James E. Young

Professor

Additional authors: Susan Prentice and Kerstin Fischer