Algorithms as fetish: Faith and possibility in algorithmic work

Algorithms are powerful because we invest in them the power to do things. With such promise, they can transform the ordinary, say, snapshots along a robotic vacuum cleaner’s route, into something much more, such as a clean home. Echoing David Graeber’s revision of fetishism, we argue that this easy slip from technical capabilities to broader claims betrays not the “magic” of algorithms but rather the dynamics of their exchange. Fetishes are not indicators of false thinking, but social contracts in material form. They mediate emerging distributions of power often too nascent, too slippery, or too disconcerting to acknowledge directly. Drawing primarily on 2016 ethnographic research with computer vision professionals, we show how faith in what algorithms can do shapes the social encounters and exchanges of their production. Analyzing algorithms through the lens of fetishism reveals the social and economic investment in some people’s labor over others’, as well as everyday opportunities for social creativity and change. We conclude that what is problematic about algorithms is not their fetishization but their stabilization into full-fledged gods and demons – the more deserving objects of critique.

[1] Rob Kitchin, et al., Towards Critical Data Studies: Charting and Unpacking Data Assemblages and Their Work, 2014.

[2] Paul Dourish, et al., Algorithms and their others: Algorithmic culture in context, 2016, Big Data & Society.

[3] Y. Engeström, et al., Perspectives on activity theory: Introduction, 1999.

[4] B. Asher, The Professional Vision, 1994.

[5] William Pietz, et al., The Problem of the Fetish, I, 1985, RES: Anthropology and Aesthetics.

[6] J. Söderberg, Media Technologies: Essays on Communication, Materiality, and Society, 2014.

[7] Tamar Sharon, et al., From data fetishism to quantifying selves: Self-tracking practices and the other values of data, 2017, New Media & Society.

[8] Steven J. Jackson, et al., Data Vision: Learning to See Through Algorithmic Abstraction, 2017, CSCW.

[9] Karrie Karahalios, et al., A path to understanding the effects of algorithm awareness, 2014, CHI Extended Abstracts.

[10] Natasha D. Schüll, Data for life: Wearable technology and the design of self-care, 2016, BioSocieties.

[11] Dawn Nafus, et al., This One Does Not Go Up to 11: The Quantified Self Movement as an Alternative Big Data Practice, 2014.

[12] Mary L. Gray, et al., The Crowd is a Collaborative Network, 2016, CSCW.

[13] Andrew D. Selbst, et al., Big Data's Disparate Impact, 2016.

[14] Jenna Burrell, How the machine ‘thinks’: Understanding opacity in machine learning algorithms, 2016.

[15] Pablo J. Boczkowski, et al., The Relevance of Algorithms, 2013.

[16] Etienne Wenger, et al., Situated Learning: Legitimate Peripheral Participation, 1991.

[17] Hongfei Fan, et al., Computer Supported Cooperative Work and Social Computing, 2018, Communications in Computer and Information Science.

[18] Lilly Irani, Justice for Data Janitors, 2019.

[19] Frank A. Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information, 2015.

[20] F. McKelvey, Algorithmic Media Need Democratic Methods: Why Publics Matter, 2014.

[21] D. Graeber, Fetishism and Social Creativity, or Fetishes are Gods in Process of Construction, 2005.

[22] Solon Barocas, et al., Data & Civil Rights: Technology Primer, 2014.

[23] Wendy Hui Kyong Chun, On "Sourcery," or Code as Fetish, 2010.

[24] Lucy A. Suchman, Plans and Situated Actions: The Problem of Human-Machine Communication, 1987.