Gender Bias in Chatbot Design

A recent UNESCO report reveals that most popular voice-based conversational agents are designed to be female, and it outlines the potentially harmful effects this can have on society. However, the report focuses on voice-based conversational agents; its analysis did not include chatbots (i.e., text-based conversational agents). Since chatbots can also be gendered in their design, we used an automated gender analysis approach to investigate three gender-specific cues in the design of 1,375 chatbots listed on the platform chatbots.org. We leveraged two gender APIs to identify the gender of each chatbot's name, a face recognition API to identify the gender of its avatar, and a text mining approach to analyze gender-specific pronouns in its description. Our results suggest that gender-specific cues are commonly used in the design of chatbots and that most chatbots are, explicitly or implicitly, designed to convey a specific gender. More specifically, most chatbots have female names, female-looking avatars, and are described as female. This is particularly evident in three application domains (i.e., branded conversations, customer service, and sales). We thus find evidence of a tendency to prefer one gender (i.e., female) over the other (i.e., male) and argue that there is a gender bias in the design of chatbots in the wild. Based on these findings, we formulate propositions as a starting point for future discussions and research to mitigate gender bias in the design of chatbots.
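The pronoun-based cue described above can be illustrated with a minimal sketch. The paper's exact word lists and classification rules are not given here, so the pronoun sets and the function below are assumptions for illustration only:

```python
import re

# Hypothetical pronoun sets; the study's actual word lists may differ.
FEMALE_PRONOUNS = {"she", "her", "hers", "herself"}
MALE_PRONOUNS = {"he", "him", "his", "himself"}

def pronoun_gender(description: str) -> str:
    """Classify a chatbot description by counting gender-specific pronouns.

    Returns 'female', 'male', 'mixed' (both kinds occur), or 'neutral'
    (no gender-specific pronouns found).
    """
    tokens = re.findall(r"[a-z']+", description.lower())
    female_count = sum(token in FEMALE_PRONOUNS for token in tokens)
    male_count = sum(token in MALE_PRONOUNS for token in tokens)
    if female_count and male_count:
        return "mixed"
    if female_count:
        return "female"
    if male_count:
        return "male"
    return "neutral"
```

For example, a description such as "Anna answers questions; she helps customers." would be classified as female, while "The bot answers FAQs." would be classified as neutral.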
