Towards Effective Differential Privacy Communication for Users’ Data Sharing Decision and Comprehension

Differential privacy protects an individual's privacy by perturbing data either at the aggregate level, after collection (DP), or at the individual level, before the data leave the user's device (LDP). We report four online human-subject experiments investigating how different ways of communicating these differential privacy techniques affect laypersons in a health app data collection setting. Experiments 1 and 2 examined participants' decisions to disclose low-sensitivity and high-sensitivity personal information when given different DP or LDP descriptions. Experiments 3 and 4 uncovered the reasons behind participants' data sharing decisions and assessed their subjective and objective comprehension of these descriptions. When shown descriptions that explained the implications of the DP or LDP technique rather than its definition or process, participants demonstrated better comprehension and were more willing to share information under LDP than under DP, indicating that they understood LDP's stronger privacy guarantee relative to DP.
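To make the DP/LDP distinction concrete, the sketch below shows the classic randomized-response mechanism, a standard way of realizing LDP for a single yes/no answer, in which each user's response is perturbed on their own device before it is ever collected. This is a minimal illustrative example, not the descriptions or health app scenario used in the study; the function name, the choice of epsilon, and the simulated "yes" rate are assumptions made for the sketch.

```python
import math
import random

def randomized_response(true_answer: bool, epsilon: float) -> bool:
    """Perturb a single yes/no answer on the user's device (LDP).

    With probability p = e^eps / (e^eps + 1) the true answer is kept;
    otherwise it is flipped. The collector only ever sees the noisy bit,
    so each individual's answer is protected even if the collector is
    not trusted (unlike aggregate-level DP, where noise is added after
    the raw data are collected).
    """
    p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_answer if random.random() < p_keep else not true_answer

# Example: 10,000 users each report a perturbed answer; the aggregator
# can still estimate the population's true "yes" rate by de-biasing.
epsilon = math.log(3)            # keep probability p = 0.75
true_rate = 0.30                 # simulated ground-truth "yes" rate
reports = [randomized_response(random.random() < true_rate, epsilon)
           for _ in range(10_000)]

p_keep = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
observed = sum(reports) / len(reports)
# P(report yes) = true_rate * (2p - 1) + (1 - p), so invert:
estimated = (observed - (1.0 - p_keep)) / (2.0 * p_keep - 1.0)
print(f"noisy rate: {observed:.3f}, de-biased estimate: {estimated:.3f}")
```

The key point for communication purposes is the implication: under LDP no individual's true answer ever reaches the data collector, which is the stronger guarantee participants appeared to recognize when it was explained in terms of consequences rather than mechanism.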
