Efficient Multimodal Cuing of Spatial Attention

In this paper, the authors detail how the distribution of spatial attention can be controlled in the important case of multimodal spatial cuing, and ground the discussion in several classes of applications.

Behavioral studies of multisensory integration and cross-modal spatial attention have identified many potential benefits of using interfaces that engage more than just a single sense in complex operating environments. Particularly relevant in terms of application, the latest research highlights that: 1) multimodal signals can be used to reorient spatial attention effectively under conditions of high operator workload in which unimodal signals may be ineffective; 2) multimodal signals are less likely to be masked in noisy environments; and 3) there are natural links between specific signals and particular behavioral responses (e.g., head turning). However, taking advantage of these potential benefits requires that interface designers take into account the limitations of the human operator. In particular, multimodal interfaces should normally be designed so as to minimize any spatial incongruence between component warning signals presented in different sensory modalities that relate to the same event. Building on this rapidly growing cognitive neuroscience knowledge base, the last decade has witnessed the development of a number of highly effective multimodal interfaces for driving, aviation, the military, medicine, and sports.
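
To make the spatial-congruence guideline concrete, the following minimal Python sketch illustrates one way a warning system might pair an auditory and a vibrotactile cue so that both are presented on the same side as the hazard they signal. All names here are hypothetical illustrations of the design principle, not an implementation from the reviewed work.

```python
from dataclasses import dataclass
from enum import Enum


class Side(Enum):
    LEFT = "left"
    RIGHT = "right"


@dataclass
class WarningCue:
    modality: str   # e.g., "auditory" or "vibrotactile"
    side: Side      # spatial location at which the cue is presented
    onset_ms: int   # onset relative to hazard detection, in milliseconds


def build_multimodal_warning(hazard_side: Side) -> list[WarningCue]:
    """Return a pair of component cues that are spatially congruent.

    Both signals are presented on the same side as the event they relate to,
    reflecting the guideline that spatial incongruence between modalities
    should be minimized.
    """
    return [
        WarningCue(modality="auditory", side=hazard_side, onset_ms=0),
        WarningCue(modality="vibrotactile", side=hazard_side, onset_ms=0),
    ]


if __name__ == "__main__":
    # Example: a hazard detected on the driver's left yields left-sided cues
    # in both modalities.
    for cue in build_multimodal_warning(Side.LEFT):
        print(cue)
```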
