Averting Robot Eyes

Home robots will cause privacy harms. At the same time, they can provide beneficial services — as long as consumers trust them. This Essay evaluates potential technological solutions that could help home robots keep their promises, avert their eyes, and otherwise mitigate privacy harms. Our goals are to inform regulators of robot-related privacy harms and the available technological tools for mitigating them, and to spur technologists to employ existing tools and develop new ones by articulating principles for avoiding privacy harms. We posit that home robots will raise privacy problems of three basic types: (1) data privacy problems; (2) boundary management problems; and (3) social/relational problems. Technological design can ward off, if not fully prevent, a number of these harms. We propose five principles for home robots and privacy design: data minimization, purpose specifications, use limitations, honest anthropomorphism, and dynamic feedback and participation. We review current research into privacy-sensitive robotics, evaluating what technological solutions are feasible and where the harder problems lie. We close by contemplating legal frameworks that might encourage the implementation of such design, while also recognizing the potential costs of regulation at these early stages of the technology.
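To make the first design principle concrete, the following is a minimal sketch, assuming a Python/OpenCV perception pipeline, of how a home robot might "avert its eyes" through data minimization: face regions are blurred in memory before any frame is retained, so identifying detail is never stored or transmitted in the first place. The function names, parameters, and pipeline structure are illustrative assumptions for this sketch, not a specification drawn from the Essay.

```python
# Illustrative sketch only: one way a home robot's perception stack could
# practice data minimization by redacting faces before any frame is stored
# or transmitted. The pipeline, function names, and parameters below are
# hypothetical examples, not a design described in the Essay.

import cv2


def redact_faces(frame, detector):
    """Blur detected face regions so retained footage omits identifying detail."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = frame[y:y + h, x:x + w]
        # Heavy Gaussian blur keeps coarse scene context but removes identity cues.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame


def capture_minimized(camera_index=0, max_frames=100):
    """Capture frames and redact them in memory before anything is retained."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    cap = cv2.VideoCapture(camera_index)
    kept = []
    try:
        for _ in range(max_frames):
            ok, frame = cap.read()
            if not ok:
                break
            # Redact before the frame ever reaches disk or the network.
            kept.append(redact_faces(frame, detector))
    finally:
        cap.release()
    return kept
```

A production system would presumably go further (downsampling, on-device processing, short retention windows tied to a stated purpose), but the design choice is the same: minimize what is collected and kept at the sensor, rather than relying on downstream policy alone.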
