Conversational Assistants and Gender Stereotypes: Public Perceptions and Desiderata for Voice Personas

Conversational voice assistants are rapidly developing from purely transactional systems into social companions with a “personality”. UNESCO recently warned that the female and submissive personas of current digital assistants give rise to concern, as they reinforce gender stereotypes. In this work, we present results from a participatory design workshop in which we invited people to express their preferences for what their ideal persona might look like, both through drawings and via a multiple-choice questionnaire. We find no clear consensus, which suggests that one possible solution is to let people configure and personalise their assistants. We then outline a multi-disciplinary project through which we plan to address the complex question of gender and stereotyping in digital assistants.
