SNaSI: Social Navigation through Subtle Interactions with an AI agent

Rébecca Kleinberger (1), Joshua Huburn (2), Martin Grayson (3), Cecily Morrison (4)
(1) MIT Media Lab, Cambridge, US & Microsoft Research HXD, Cambridge, UK; rebklein@media.mit.edu
(2, 3, 4) Microsoft Research HXD, Cambridge, UK; jhuburn@gmail.com, mgrayson@microsoft.com, cecilym@microsoft.com

Technology advances have set the stage for intelligent visual agents, with many initial applications being created for people who are blind or have low vision. While most focus on spatial navigation, recent literature suggests that supporting social navigation could be particularly powerful, providing appropriate cues that allow blind and low vision people to enter into and sustain social interaction. A particularly poignant design challenge in enabling social navigation is managing agent interaction in a way that augments rather than disturbs social interaction. Use of existing agent-like technologies has surfaced some of the difficulties in this regard: it is difficult to talk to a person while an agent is speaking to them, and it is equally difficult to speak with someone who is fiddling with a device to manipulate their agent. In this paper we present SNaSI, a wearable designed to provoke thinking about how we can support social navigation through subtle interaction. Specifically, we are interested in generating thinking about the triangular relationship between a blind user, a communication partner, and the system containing an AI agent. We explore how notions of subtlety, but not invisibility, can enable this triadic relationship. SNaSI builds upon previous research on sensory substitution and the work of Bach-y-Rita (Bach-y-Rita 2003), but explores those ideas in the form of a social instrument.

[1] Hiroshi Ishii et al. Ambient Displays: Turning Architectural Space into an Interface between People and Digital Information, 1998, CoBuild.

[2] P. Bach-y-Rita et al. Sensory substitution and the human–machine interface, 2003, Trends in Cognitive Sciences.

[3] R. Gassert et al. Augmented white cane with multimodal haptic feedback, 2010, 3rd IEEE RAS & EMBS International Conference on Biomedical Robotics and Biomechatronics.

[4] A. Fiannaca et al. Headlock: a wearable navigation aid that helps blind cane users traverse large open spaces, 2014, ASSETS.

[5] Alan Borning et al. Where's my bus stop? Supporting independence of blind transit riders with StopInfo, 2014, ASSETS.

[6] Rabia Jafri et al. Exploring the potential of eyewear-based wearable display devices for use by the visually impaired, 2014, 3rd International Conference on User Science and Engineering (i-USEr).

[7] Sethuraman Panchanathan et al. Visual-to-tactile mapping of facial movements for enriched social interactions, 2014, IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE).

[8] IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE), 2015.

[9] Juan Ye et al. Capturing social cues with imaging glasses, 2016, UbiComp Adjunct.

[10] Abigail Sellen et al. "Like Having a Really Bad PA": The Gulf between User Expectation and Experience of Conversational Agents, 2016, CHI.

[11] Sethuraman Panchanathan et al. Social Interaction Assistant: A Person-Centered Approach to Enrich Social Interactions for Individuals With Visual Impairments, 2016, IEEE Journal of Selected Topics in Signal Processing.

[12] Edward Cutrell et al. Imagining Artificial Intelligence Applications with People with Visual Disabilities using Tactile Ideation, 2017, ASSETS.

[13] Chieko Asakawa et al. People with Visual Impairment Training Personal Object Recognizers: Feasibility and Challenges, 2017, CHI.

[14] Hironobu Takagi et al. NavCog3: An Evaluation of a Smartphone-Based Blind Indoor Navigation Assistant with Semantic Features in a Large-Scale Environment, 2017, ASSETS.

[15] Roberto Manduchi et al. Easy Return: An App for Indoor Backtracking Assistance, 2018, CHI.

[16] Meredith Ringel Morris et al. Enabling People with Visual Impairments to Navigate Virtual Reality with a Haptic and Auditory Cane Simulation, 2018, CHI.

[17] Anke M. Brock et al. Towards a Multisensory Augmented Reality Map for Blind and Low Vision People: a Participatory Design Approach, 2018, CHI.

[18] Leah Findlater et al. "Accessibility Came by Accident": Use of Voice-Controlled Intelligent Personal Assistants by People with Disabilities, 2018, CHI.