Children of Color's Perceptions of Fairness in AI: An Exploration of Equitable and Inclusive Co-Design

When it comes to algorithmic rights and protections for children, designers must confront new paradigms to solve the problems they target. The field of Design typically deals with form and function, executed in molecules or pixels; algorithms have neither. More importantly, algorithms may be biased in their execution against those without privileged status, such as people of color, children, and the non-affluent. In this paper, we review our work exploring perceptions of fairness in AI through co-design sessions with children of color in non-affluent neighborhoods of Baltimore City. The sessions focused on designing an artificially intelligent librarian for the children's local library branch. Our preliminary findings highlight three key themes in this group's perceptions of fairness in the context of an artificially intelligent authority figure.
