Interweaving Visual and Audio-Haptic Augmented Reality for Urban Exploration

While ordinary touchscreen-based interfaces in urban exploration applications draw much of a user's attention onto the screen, visual and audio-haptic augmented reality interfaces have emerged as the two main approaches for keeping the user's focus on the surroundings. However, neither interface alone satisfies users in highly dynamic urban environments. This research investigates how these two complementary augmentation modalities can coexist in a single system and how people adapt to the situation at hand by selecting the more suitable interface. A prototype was deployed in a field experiment in which participants explored points of interest in an urban environment using both interfaces, and their engagement with the surroundings was compared against a touchscreen-based application. Most participants spontaneously switched between the two interfaces, demonstrating the value of having both available in one system. The results point to the situated advantages of each interface and reveal users' preferences when both are available.
