Sensible Privacy: How We Can Protect Domestic Violence Survivors Without Facilitating Misuse

Privacy is a concept with real-life ties and implications. Privacy infringement can lead to serious consequences for the stakeholders involved, which is why researchers and organisations have developed a variety of privacy-enhancing techniques and tools. However, no single solution fits all cases, and privacy solutions can themselves be misused, for example to hide nefarious activities. It is therefore important to provide suitable safeguards and to make the necessary design trade-offs to avoid such misuse. This short paper makes the case for careful consideration when designing a privacy solution, so that the design effectively addresses the user requirements while minimising the risk of inadvertently assisting potential offenders. In other words, this paper strives to promote "sensible privacy" design, which deals with the complex challenge of balancing privacy, usability and accountability. We illustrate this idea through a case study involving the design of privacy solutions for domestic violence survivors; this case study is the main contribution of the paper. It presents specific user requirements and operating conditions which, coupled with the attacker model, provide a complex yet interesting scenario to explore. One of our solutions is described in detail to demonstrate the feasibility of our approach.
