"Hey Model!" – Natural User Interactions and Agency in Accessible Interactive 3D Models

While developments in 3D printing have opened up opportunities for improved access to graphical information for people who are blind or have low vision (BLV), 3D printed models on their own can convey only limited detail and contextual information. Interactive 3D printed models (I3Ms) that provide audio labels and/or a conversational agent interface can potentially overcome this limitation. We conducted a Wizard-of-Oz exploratory study to uncover the multi-modal interaction techniques that BLV people would like to use when exploring I3Ms, and investigated their attitudes towards different levels of model agency. These findings informed the creation of an I3M prototype of the solar system. A second user study with this model revealed a hierarchy of interaction: BLV users preferred tactile exploration first, followed by touch gestures to trigger audio labels, and then natural language to fill in knowledge gaps and confirm understanding.
