A Scanner Darkly: Protecting User Privacy from Perceptual Applications

Perceptual, "context-aware" applications that observe their environment and interact with users via cameras and other sensors are becoming ubiquitous on personal computers, mobile phones, gaming platforms, household robots, and augmented-reality devices. This raises new privacy risks. We describe the design and implementation of DARKLY, a practical privacy protection system for the increasingly common scenario where an untrusted, third-party perceptual application is running on a trusted device. DARKLY is integrated with OpenCV, a popular computer vision library used by such applications to access visual inputs. It deploys multiple privacy protection mechanisms, including access control, algorithmic privacy transforms, and user audit. We evaluate DARKLY on 20 perceptual applications that perform diverse tasks such as image recognition, object tracking, security surveillance, and face detection. These applications run on DARKLY unmodified or with very few modifications and minimal performance overheads vs. native OpenCV. In most cases, privacy enforcement does not reduce the applications' functionality or accuracy. For the rest, we quantify the tradeoff between privacy and utility and demonstrate that utility remains acceptable even with strong privacy protection.
