“Strongly Recommended”: Revisiting Decisional Privacy to Judge Hypernudging in Self-Tracking Technologies

This paper explores and rehabilitates the value of decisional privacy as a conceptual tool, complementary to informational privacy, for critiquing the personalized choice architectures employed by self-tracking technologies. Self-tracking technologies are promoted and used as a means to self-improvement: drawing on large aggregates of personal data and the data of other users, they offer personalized feedback that nudges the user toward behavioral change. This real-time personalization of choice architectures requires continuous surveillance and constitutes a very powerful technique, recently coined “hypernudging.” While users celebrate the increased personalization of their coaching devices, hypernudging technologies raise concerns about manipulation. This paper addresses that intuition by arguing that decisional privacy is at stake. It thereby counters the tendency to focus solely on informational privacy when evaluating information and communication technologies, and proposes that decisional and informational privacy are often part of a mutually reinforcing dynamic. Hypernudging serves as a key example to illustrate that the two dimensions should not be treated separately: hypernudging self-tracking technologies compromise autonomy because they violate both informational and decisional privacy. To judge effectively whether technologies that use hypernudges empower users, we need both privacy dimensions as conceptual tools.
