Toward situated interventions for algorithmic equity: lessons from the field

Research on the fairness, accountability, and transparency of algorithmic systems has largely focused on identifying failures of current systems and on technical interventions intended to reduce bias in computational processes. Less attention has been given to methods that account for the social and political contexts of specific, situated technical systems at their points of use. Co-developing algorithmic accountability interventions with communities yields outcomes that are more likely to address problems in their situated context and to re-center power with those most disparately affected by the harms of algorithmic systems. In this paper we report on our experiences using participatory and co-design methods for algorithmic accountability in a project called the Algorithmic Equity Toolkit. Our main insights were: (i) many meaningful interventions toward equitable algorithmic systems are non-technical; (ii) community organizations derive the most value from materials localized to their particular policy context, as opposed to materials designed to be "scalable" beyond it; and (iii) framing harms in terms of algorithmic bias suggests that more accurate data is the solution, at the risk of missing deeper questions about whether some technologies should be used at all. More broadly, we found that community-based methods offer an important inroad to addressing algorithmic harms in their situated contexts.
