A Human-Centric Perspective on Digital Consenting: The Case of GAFAM

According to various legal frameworks, such as the European General Data Protection Regulation (GDPR), an end-user's consent constitutes one of the well-known legal bases for personal data processing. However, research has shown that most end-users have difficulty making sense of what they are consenting to in the digital world, and that marginalized people face even greater difficulties in managing their digital privacy. In this paper, drawing on an enactivist perspective in cognitive science, we develop a basic human-centric framework for digital consent. We argue that the act of consenting is a sociocognitive action comprising cognitive, collective, and contextual aspects. Based on this theoretical framework, we present a qualitative evaluation of the consent-obtaining practices of the five big tech companies, i.e., Google, Amazon, Facebook, Apple, and Microsoft (GAFAM). The evaluation shows that these companies fall short in empowering end-users with respect to the human-centric aspects of the act of consenting. On this basis, we argue that their consent-obtaining mechanisms violate principles of fairness, accountability, and transparency, and suggest that our approach may even cast doubt on the lawfulness of the acquired consent, particularly in light of the basic requirements for lawful consent under the GDPR.