A Tactile Compass for Eyes-Free Pedestrian Navigation

This paper reports on the first systematic investigation of how to guide people to a destination using the haptic feedback of a mobile phone, together with its experimental evaluation. The aim was to find a navigation aid that works hands-free, reduces user distraction, and can be realised with widely available handheld devices. To explore the design space, we developed and tested several prototypes. Drawing on the results of these tests, we present the concept of a tactile compass, which encodes the direction of a location "as the crow flies" in rhythmic vibration patterns and its distance in the pause between two patterns. This paper also reports on the first experimental comparison of such tactile displays with visual navigation systems. The tactile compass was used to continuously display the location of a destination from the user's perspective (e.g. ahead, close). In a field experiment involving the tactile compass and an interactive map, three conditions were investigated: tactile only, visual only, and combined. The results provide evidence that cueing spatial locations through vibration patterns can form an effective and efficient navigation aid. No significant differences in navigation performance were found between the conditions. Used alone, the tactile compass significantly reduced the amount of distracting interaction, and combined with the map it improved the participants' confidence in the navigation system.
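
To make the encoding concrete, the sketch below shows one way a tactile compass cue could be computed: the bearing to the destination (relative to the user's heading) selects a rhythmic vibration pattern, and the distance sets the pause before the pattern repeats. This is a minimal illustrative Python sketch; the sector boundaries, pulse durations, and distance-to-pause scaling are assumptions for illustration, not the parameters used in the paper.

```python
import math

# Assumed mapping: each 90-degree sector around the user gets its own
# rhythmic pulse pattern (durations in milliseconds: vibrate, gap, vibrate, ...).
# These concrete patterns are illustrative, not taken from the paper.
SECTOR_PATTERNS = {
    "ahead":  [100],                        # one short pulse
    "right":  [100, 100, 100],              # two short pulses
    "behind": [100, 100, 100, 100, 100],    # three short pulses
    "left":   [300],                        # one long pulse
}

def bearing_sector(user_heading_deg: float, bearing_to_goal_deg: float) -> str:
    """Classify the goal direction relative to the user's current heading."""
    relative = (bearing_to_goal_deg - user_heading_deg) % 360.0
    if relative < 45 or relative >= 315:
        return "ahead"
    if relative < 135:
        return "right"
    if relative < 225:
        return "behind"
    return "left"

def pause_for_distance(distance_m: float,
                       min_pause_ms: int = 500,
                       max_pause_ms: int = 3000,
                       max_distance_m: float = 500.0) -> int:
    """Map distance to the pause between two patterns: the closer the goal,
    the shorter the pause (assumed linear scaling)."""
    ratio = min(distance_m, max_distance_m) / max_distance_m
    return int(min_pause_ms + ratio * (max_pause_ms - min_pause_ms))

def tactile_compass_cue(user_heading_deg: float,
                        bearing_to_goal_deg: float,
                        distance_m: float) -> tuple[list[int], int]:
    """Return (vibration pattern, pause in ms before the next pattern)."""
    sector = bearing_sector(user_heading_deg, bearing_to_goal_deg)
    return SECTOR_PATTERNS[sector], pause_for_distance(distance_m)

if __name__ == "__main__":
    # Example: destination slightly to the right of the walking direction, 120 m away.
    pattern, pause = tactile_compass_cue(user_heading_deg=10.0,
                                         bearing_to_goal_deg=80.0,
                                         distance_m=120.0)
    print(pattern, pause)  # -> [100, 100, 100] 1100
```

In a real handset implementation, the returned pattern and pause would be passed to the platform's vibration API and recomputed continuously as GPS position and compass heading change; the sketch only shows the encoding step.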
