Efficient Multimodal Cuing of Spatial Attention

Behavioral studies of multisensory integration and cross-modal spatial attention have identified many potential benefits of interfaces that engage more than a single sense in complex operating environments. Of particular applied relevance, the latest research highlights that: 1) multimodal signals can effectively reorient spatial attention under conditions of high operator workload in which unimodal signals may fail; 2) multimodal signals are less likely to be masked in noisy environments; and 3) there are natural links between specific signals and particular behavioral responses (e.g., head turning). Taking advantage of these potential benefits, however, requires that interface designers account for the limitations of the human operator. In particular, multimodal interfaces should normally be designed to minimize any spatial incongruence between component warning signals presented in different sensory modalities that relate to the same event. Building on this rapidly growing cognitive neuroscience knowledge base, the last decade has witnessed the development of a number of highly effective multimodal interfaces for driving, aviation, the military, medicine, and sports.
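The spatial-congruence principle above can be made concrete in interface code. The following is a minimal, hypothetical sketch (the `Cue`, `Direction`, and function names are illustrative, not from any cited system): a multimodal warning is built so that every component cue points at the same threat location, and a simple check guards against cross-modal spatial incongruence.

```python
from dataclasses import dataclass
from enum import Enum


class Direction(Enum):
    LEFT = "left"
    RIGHT = "right"
    FRONT = "front"
    REAR = "rear"


@dataclass(frozen=True)
class Cue:
    modality: str        # e.g. "auditory", "tactile", "visual"
    direction: Direction  # where the cue is presented from


def make_congruent_warning(threat_direction: Direction) -> list[Cue]:
    """Build a multimodal warning whose component cues all point
    at the threat, avoiding cross-modal spatial incongruence."""
    return [
        Cue("auditory", threat_direction),
        Cue("tactile", threat_direction),
    ]


def is_spatially_congruent(cues: list[Cue]) -> bool:
    """True when every component cue shares a single direction."""
    return len({cue.direction for cue in cues}) <= 1


# A rear-collision warning pairs a rear loudspeaker tone with a
# rear vibrotactile pulse, rather than mixing directions.
warning = make_congruent_warning(Direction.REAR)
print(is_spatially_congruent(warning))  # True
```

The design point is that the check operates on the warning as a whole: an auditory cue from the left combined with a tactile cue on the right would fail `is_spatially_congruent`, which is exactly the configuration the research summarized above warns against.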
