Algorithmic Injustices: Towards a Relational Ethics

It has become trivial to point out how decision-making processes in various social, political, and economic spheres are assisted by automated systems. Improved efficiency, the hallmark of these systems, drives their mass-scale integration into daily life. However, as a robust body of research in the area of algorithmic injustice shows, algorithmic tools embed and perpetuate societal and historical biases and injustice. In particular, a persistent trend within the literature indicates that society's most vulnerable are disproportionately impacted. When algorithmic injustice and bias are brought to the fore, most of the solutions on offer 1) revolve around technical fixes and 2) do not centre disproportionately impacted groups. This paper zooms out and draws the bigger picture. It 1) argues that concerns surrounding algorithmic decision making and algorithmic injustice require fundamental rethinking above and beyond technical solutions, and 2) outlines a way forward in a manner that centres vulnerable groups through the lens of relational ethics.
