6th senses for everyone!: the value of multimodal feedback in handheld navigation aids

One of the bottlenecks in today's pedestrian navigation systems is communicating navigation instructions in an efficient but non-distracting way. Previous work has suggested tactile feedback as a solution, but it is not yet clear how it should be integrated into handheld navigation systems to improve efficiency and reduce distraction. In this paper we investigate augmenting and replacing a state-of-the-art pedestrian navigation system with tactile navigation instructions. In a field study in a lively city centre, 21 participants had to reach given destinations by means of tactile, visual, or multimodal navigation instructions. In the tactile and multimodal conditions, the handheld device created vibration patterns indicating the direction of the next waypoint. Like a sixth sense, it constantly gave the user an idea of how the route continues. The results provide evidence that combining both modalities leads to more efficient navigation performance, while using tactile feedback alone reduces the user's distraction.
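To make the idea of direction-encoding vibration patterns concrete, the following is a minimal sketch (not the authors' implementation; the function names, angular thresholds, and rhythm encoding are illustrative assumptions) of how a handheld device with a single vibration motor might map the bearing to the next waypoint, relative to the user's current heading, onto a small set of distinguishable patterns:

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Great-circle initial bearing from point 1 to point 2, in degrees [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(x, y)) % 360

def vibration_pattern(waypoint_bearing, device_heading):
    """Map the waypoint direction relative to the user's heading onto a
    simple rhythm code for a single vibration motor (thresholds are
    illustrative, not taken from the study)."""
    rel = ((waypoint_bearing - device_heading + 180) % 360) - 180  # -180..180
    if abs(rel) <= 30:
        return "ahead"    # e.g. one long pulse
    if abs(rel) >= 135:
        return "behind"   # e.g. four short pulses
    return "right" if rel > 0 else "left"  # e.g. two vs. three short pulses
```

In a running system, the returned label would be translated into actual motor pulse timings and re-evaluated continuously from GPS and compass readings, giving the "sixth sense" effect of an always-available directional cue.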
