Gender preferences for robots and gender equality orientation in communication situations

Bibliographic Details
Published in: AI & Society, Vol. 39, No. 2, pp. 739–748
Main Authors: Suzuki, Tomohiro; Nomura, Tatsuya
Format: Journal Article
Language: English
Published: London: Springer London (Springer Nature B.V.), 01.04.2024

Summary: The individual physical appearances of robots are considered significant, much as those of humans are. We investigated whether users prefer robots with male or female physical appearances for use in daily communication situations, and whether egalitarian gender role attitudes are related to this preference. One thousand adult men and women aged 20–60 participated in the questionnaire survey. The results showed that in most situations and for most respondents, "male" was not selected; "female" or "either" was selected instead. Moreover, the number of respondents who chose "either" was higher than the number who chose "female." Furthermore, we examined the relationship between respondent gender and gender preference and confirmed that the effect of gender on the gender preference for a robot weakened when the human factor was eliminated. In addition, in some situations for android-type robots and in all situations for machine-type robots, equality orientation in gender role attitudes was higher among people who did not specify a gender preference. We conclude that there is no need to introduce a robot with a specified gender. Robots with a gender-neutral appearance might be more appropriate for applications requiring complex human–robot interaction and may help avoid reproducing gender bias.
ISSN: 0951-5666
eISSN: 1435-5655
DOI: 10.1007/s00146-022-01438-7